Search results for: corridor densification option model
15888 The Impact of the Composite Expanded Graphite PCM on the PV Panel Whole Year Electric Output: Case Study Milan
Authors: Hasan A Al-Asadi, Ali Samir, Afrah Turki Awad, Ali Basem
Abstract:
Integrating phase change material (PCM) with photovoltaic (PV) panels is one of the effective techniques to reduce the PV panel temperature and increase its electric output. In order to investigate the impact of the PCM on the electric output of PV panels over a whole year, a lumped-distributed parameter model for the PV-PCM module has been developed. The model accounts for the variation in PCM density between the solid and liquid phases, which increases the accuracy with which the electric output of the PV-PCM module is assessed. The second contribution is to assess the impact of the composite expanded graphite-PCM on the PV electric output in Milan over a whole year. The novel one-dimensional model has been solved using MATLAB, and its results have been validated against experimental work from the literature. Weather and solar radiation data have been collected, and the impact of the expanded graphite-PCM on the electric output of the PV panel over a whole year has been investigated. The results indicate that the composite PCM enhances the annual electric output of the PV panel in Milan by 2.39%.
Keywords: PV panel efficiency, PCM, numerical model, solar energy
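For context, lumped PV thermal-electrical models of this kind typically link cell temperature to conversion efficiency through a linear temperature-coefficient relation; the form below is the standard one and is given here only as an illustration, not as the paper's exact formulation:

```latex
\eta_{PV}(T_c) = \eta_{ref}\,\bigl[\,1 - \beta_{ref}\,(T_c - T_{ref})\,\bigr]
```

where η_ref is the efficiency at the reference temperature T_ref (typically 25 °C) and β_ref is the temperature coefficient of the module; lowering T_c with the PCM therefore raises the electrical yield.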
Procedia PDF Downloads 177
15887 Analytical Solution for Stellar Distance Based on Photon Dominated Cosmic Expansion Model
Authors: Xiaoyun Li, Suoang Longzhou
Abstract:
This paper derives the analytical solution for stellar distance as a function of redshift based on the photon-dominated universe expansion model. First, it calculates the stellar separation speed and the farthest distance of observable stars via simulation. Then the analytical solution for stellar distance as a function of redshift is derived. It shows that when the redshift is large, the stellar distance (and its separation speed) is not proportional to the redshift because of relativistic effects. It also reveals the relationship between stellar age and redshift. The correctness of the analytical solution is verified against the latest astronomical observations of Type Ia supernovae from 2020.
Keywords: redshift, cosmic expansion model, analytical solution, stellar distance
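As a point of reference for the non-proportionality mentioned above, the standard special-relativistic Doppler relation between redshift and recession speed already departs from linearity at large z; this is a textbook relation, not the paper's photon-dominated solution:

```latex
1 + z = \sqrt{\frac{1 + v/c}{1 - v/c}}
\qquad\Longrightarrow\qquad
\frac{v}{c} = \frac{(1+z)^2 - 1}{(1+z)^2 + 1}
```

so v grows linearly with z only when z ≪ 1 and saturates towards c as z becomes large.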
Procedia PDF Downloads 165
15886 Knowledge Audit Model for Requirement Elicitation Process
Authors: Laleh Taheri, Noraini C. Pa, Rusli Abdullah, Salfarina Abdullah
Abstract:
Knowledge plays an important role in the success of any organization. Software development organizations are highly knowledge-intensive, especially in their Requirement Elicitation Process (REP). There are several problems with communicating and using knowledge in REP, such as misunderstandings, out-of-scope requirements, conflicting information, and changing requirements, all of which occur while transmitting requirements knowledge during REP. Several studies have addressed REP in order to solve these requirements-related problems. Knowledge Audit (KA) approaches have been proposed for managing knowledge in human resources, finance, and manufacturing, but there is a lack of studies applying KA to the requirement elicitation process. Therefore, this paper proposes a KA model for REP to support the acquisition of good requirements.
Keywords: knowledge audit, requirement elicitation process, KA model, knowledge in requirement elicitation
Procedia PDF Downloads 349
15885 Preference for Housing Services and Rational House Price Bubbles
Authors: Stefanie Jeanette Huber
Abstract:
This paper explores the relevance and implications of preferences for housing services for house price fluctuations through the lens of an overlapping generations model. The model implies that an economy whose agents have lower preferences for housing services is characterized by lower expenditure shares on housing services and will tend to experience more frequent and more volatile housing bubbles. These model predictions are tested empirically in the companion paper Housing Booms and Busts - Convergences and Divergences across OECD Countries. Between 1970 and 2013, countries that spent less on housing services as a share of total income experienced significantly more housing cycles, and the associated boom-bust cycles were more violent. Finally, the model is used to study the impact of rental subsidies and help-to-buy schemes on rational housing bubbles. Rental subsidies are found to contribute to the control of housing bubbles, whereas help-to-buy schemes make the economy more bubble-prone.
Keywords: housing bubbles, housing booms and busts, preference for housing services, expenditure shares for housing services, rental and purchase subsidies
Procedia PDF Downloads 302
15884 Autonomous Quantum Competitive Learning
Authors: Mohammed A. Zidan, Alaa Sagheer, Nasser Metwally
Abstract:
Real-time learning is an important goal that much artificial intelligence research tries to achieve. Many problems and applications require low-cost learning, such as teaching a robot to classify and recognize patterns in real time and to perform real-time recall. In this contribution, we suggest a model of quantum competitive learning based on a series of quantum gates and an additional operator. The proposed model can recognize incomplete patterns, and the probability of recognizing the desired pattern can be increased at the expense of the undesired ones. Moreover, these undesired patterns can be utilized as new patterns for the system. The proposed model compares favourably with classical approaches and is more powerful than current quantum competitive learning approaches.
Keywords: competitive learning, quantum gates, winner-take-all
Procedia PDF Downloads 476
15883 Predicting Indonesia External Debt Crisis: An Artificial Neural Network Approach
Authors: Riznaldi Akbar
Abstract:
In this study, we compared the performance of an Artificial Neural Network (ANN) model with a back-propagation algorithm in correctly predicting in-sample and out-of-sample external debt crises in Indonesia. We found that the exchange rate, foreign reserves, and exports are the major determinants of external debt crises. The ANN in-sample performance provides relatively superior results: the model correctly classifies 89.12 per cent of crises with a reasonably low false alarm rate of 7.01 per cent. Out of sample, the prediction performance deteriorates compared to the in-sample performance. This can be explained by the ANN model tending to over-fit the in-sample data while failing to fit the out-of-sample data well. Ten-fold cross-validation has been used to improve the out-of-sample prediction accuracy. The results also offer policy implications. The out-of-sample performance can be very sensitive to the sample size, as it can yield a higher total misclassification error and lower prediction accuracy. The ANN model can be used to identify past crisis episodes with some accuracy, but predicting crises outside the estimation sample is much more challenging because of the presence of uncertainty.
Keywords: debt crisis, external debt, artificial neural network, ANN
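A minimal sketch of the kind of workflow described, a back-propagation-trained network evaluated with 10-fold cross-validation, is shown below; the synthetic data, network size, and feature ordering are illustrative assumptions, not the authors' specification:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Hypothetical yearly observations: [exchange rate change, reserves/GDP, export growth]
X = rng.normal(size=(200, 3))
y = (X[:, 0] - 0.8 * X[:, 1] - 0.5 * X[:, 2]
     + rng.normal(scale=0.5, size=200) > 0.7).astype(int)  # 1 = crisis year

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), solver="adam",
                  max_iter=2000, random_state=0),
)

# 10-fold cross-validation to estimate out-of-sample accuracy.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```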
Procedia PDF Downloads 446
15882 Clustering Performance Analysis using New Correlation-Based Cluster Validity Indices
Authors: Nathakhun Wiroonsri
Abstract:
There are various cluster validity measures used for evaluating clustering results. One of the main objectives of using these measures is to seek the optimal, unknown number of clusters. Some measures work well for clusters with different densities, sizes and shapes. Yet one weakness that those validity measures share is that they often provide only one clear optimal number of clusters. That number is actually unknown, and there might be more than one potential sub-optimal option that a user may wish to choose, depending on the application. We develop two new cluster validity indices based on the correlation between the actual distance between a pair of data points and the distance between the centroids of the clusters in which the two points are located. Our proposed indices consistently yield several peaks at different numbers of clusters, which overcomes the weakness stated above. Furthermore, the introduced correlation can also be used for evaluating the quality of a selected clustering result. Several experiments in different scenarios, including the well-known iris data set and a real-world marketing application, have been conducted to compare the proposed validity indices with several well-known ones.
Keywords: clustering algorithm, cluster validity measure, correlation, data partitions, iris data set, marketing, pattern recognition
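The sketch below illustrates the kind of correlation the abstract describes, computed between pairwise point distances and the corresponding centroid distances for a k-means partition; it is an illustration of the idea only, not the authors' actual indices:

```python
import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X = load_iris().data

def distance_correlation_score(X, k, random_state=0):
    """Correlation between point-pair distances and their cluster-centroid distances."""
    km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(X)
    labels, centers = km.labels_, km.cluster_centers_
    point_d, centroid_d = [], []
    for i, j in combinations(range(len(X)), 2):
        point_d.append(np.linalg.norm(X[i] - X[j]))  # actual distance between the pair
        centroid_d.append(np.linalg.norm(centers[labels[i]] - centers[labels[j]]))  # 0 if same cluster
    return np.corrcoef(point_d, centroid_d)[0, 1]

for k in range(2, 8):
    print(k, round(distance_correlation_score(X, k), 3))
```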
Procedia PDF Downloads 105
15881 Failure Inference and Optimization for Step Stress Model Based on Bivariate Wiener Model
Authors: Soudabeh Shemehsavar
Abstract:
In this paper, we consider the situation under a life test in which the failure times of the test units are not deterministically related to an observable, stochastic, time-varying covariate. In such a case, the joint distribution of the failure time and a marker value is useful for modeling the step stress life test. Accelerating such an experiment is the main aim of this paper. We present a step stress accelerated model based on a bivariate Wiener process with one component as the latent (unobservable) degradation process, which determines the failure times, and the other as a marker process, whose degradation values are recorded at the times of failure. Parametric inference based on the proposed model is discussed, and the optimization procedure for obtaining the optimal time for changing the stress level is presented. The optimization criterion is to minimize the approximate variance of the maximum likelihood estimator of a percentile of the products’ lifetime distribution.
Keywords: bivariate normal, Fisher information matrix, inverse Gaussian distribution, Wiener process
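A small simulation sketch of a correlated bivariate Wiener process with a latent degradation component that triggers failure at a threshold is given below; the drift, diffusion, correlation, and threshold values are arbitrary illustrations rather than the paper's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_unit(mu=(0.5, 0.3), sigma=(1.0, 0.8), rho=0.6,
                  threshold=10.0, dt=0.05, t_max=100.0):
    """Latent degradation X1 (drives failure) and observable marker X2."""
    cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                    [rho * sigma[0] * sigma[1], sigma[1]**2]]) * dt
    x, t = np.zeros(2), 0.0
    while x[0] < threshold and t < t_max:
        x += np.array(mu) * dt + rng.multivariate_normal([0.0, 0.0], cov)
        t += dt
    return t, x[1]  # failure time and marker value recorded at failure

samples = [simulate_unit() for _ in range(200)]
times, markers = np.array(samples).T
print("mean failure time:", round(times.mean(), 2),
      "corr(time, marker):", round(np.corrcoef(times, markers)[0, 1], 3))
```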
Procedia PDF Downloads 321
15880 The Effects of Different Parameters of Wood Floating Debris on Scour Rate Around Bridge Piers
Authors: Muhanad Al-Jubouri
Abstract:
Local scour is the most important of the several types of scour affecting bridge performance and safety. Even though scour is widespread at bridges, especially during flood seasons, experimental tests cannot be applied to many standard highway bridges. A computational fluid dynamics numerical model was used to solve the problem of calculating local scour and deposition for non-cohesive silt and clear-water conditions near single and double cylindrical piers with the effect of floating debris. FLOW-3D software is employed with the RNG turbulence model, the Nilsson bed-load transfer equation, and a fine mesh size. The numerical findings for single cylindrical piers correspond quite well with the physical model's results. Furthermore, after a parameter-effectiveness study investigates the range of outcomes based on user inputs such as the bed-load equation, mesh cell size, and turbulence model, the final numerical predictions are compared to experimental data. When the findings are compared, the error for the deepest point of the scour is equivalent to 3.8% for the single-pier example.
Keywords: local scouring, non-cohesive, clear water, computational fluid dynamics, turbulence model, bed-load equation, debris
Procedia PDF Downloads 73
15879 The Role of Group Size, Public Employees’ Wages and Control Corruption Institutions in a Game-Theoretical Model of Public Corruption
Authors: Pablo J. Valverde, Jaime E. Fernandez
Abstract:
This paper shows under which conditions public corruption can emerge. The theoretical model includes variables such as the public employee wage (w), a corruption control parameter (c), and the group size of interactions (GS) between clusters of public officers and contractors. The system behavior is analyzed using phase diagrams based on combinations of these parameters (c, w, GS). Numerical simulations are implemented in order to contrast the analytic results based on Nash equilibria of the theoretical model. Major findings include a functional relationship between wages and network topology that helps reduce the emergence of corrupt behavior.
Keywords: public corruption, game theory, complex systems, Nash equilibrium
Procedia PDF Downloads 246
15878 Evaluating the Suitability and Performance of Dynamic Modulus Predictive Models for North Dakota’s Asphalt Mixtures
Authors: Duncan Oteki, Andebut Yeneneh, Daba Gedafa, Nabil Suleiman
Abstract:
Most agencies lack the equipment required to measure the dynamic modulus (|E*|) of asphalt mixtures, necessitating the use of predictive models. This study compared measured |E*| values for nine North Dakota asphalt mixes against predictions from the original Witczak, modified Witczak, and Hirsch models. The influence of temperature on the |E*| models was investigated, and Pavement ME simulations were conducted using the measured |E*| and the predictions from the most accurate |E*| model. The results revealed that the original Witczak model yielded the lowest Se/Sy and highest R² values, indicating the lowest bias and highest accuracy, while the poorest overall performance was exhibited by the Hirsch model. Using predicted |E*| as input to Pavement ME generated conservative distress predictions compared to using measured |E*|. The original Witczak model was recommended for predicting |E*| for low-reliability pavements in North Dakota.
Keywords: asphalt mixture, binder, dynamic modulus, MEPDG, pavement ME, performance, prediction
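The goodness-of-fit statistics quoted above can be illustrated with a short sketch comparing measured and predicted |E*| values; the arrays are placeholders, and degrees-of-freedom conventions for Se vary between guides, so this is only one plausible formulation:

```python
import numpy as np

def goodness_of_fit(measured, predicted):
    """Se/Sy ratio and R^2 for predicted vs. measured dynamic modulus."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    n = len(measured)
    se = np.sqrt(np.sum((predicted - measured) ** 2) / (n - 1))  # standard error of estimate
    sy = np.std(measured, ddof=1)                                # std. dev. of measured values
    r2 = 1.0 - (np.sum((predicted - measured) ** 2)
                / np.sum((measured - measured.mean()) ** 2))
    return se / sy, r2

# Placeholder |E*| values in MPa for one mix at several temperature/frequency points.
measured = [12000, 9500, 7200, 4100, 1800, 650]
predicted = [11500, 9900, 6800, 4400, 2100, 700]
ratio, r2 = goodness_of_fit(measured, predicted)
print(f"Se/Sy = {ratio:.3f}, R^2 = {r2:.3f}")
```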
Procedia PDF Downloads 53
15877 Efficiency of Secondary Schools by ICT Intervention in Sylhet Division of Bangladesh
Authors: Azizul Baten, Kamrul Hossain, Abdullah-Al-Zabir
Abstract:
The objective of this study is to develop an appropriate stochastic frontier model of secondary school efficiency under ICT intervention and to examine the impact of ICT challenges on secondary school efficiency in the Sylhet division of Bangladesh using stochastic frontier analysis. The Translog stochastic frontier model was found to be more appropriate than the Cobb-Douglas model for secondary school efficiency under ICT intervention. Based on the results of the Cobb-Douglas model, the coefficients of the number of teachers, the number of students, and teaching ability had positive effects on the level of efficiency, indicating that these are related to technical efficiency. In the case of the inefficiency effects for both the Cobb-Douglas and Translog models, the coefficient of the ICT lab decreased secondary school inefficiency, but online classes in school were found to increase inefficiency. The coefficients of teachers’ preference for ICT tools such as multimedia projectors contributed to decreasing secondary school inefficiency in the Sylhet division of Bangladesh. The interaction effects of the number of teachers and the classrooms, the number of students and the number of classrooms, the number of students and teaching ability, and the classrooms and the teaching ability of the teachers were positive and had a positive impact on secondary school efficiency. The overall mean efficiency of urban secondary schools was 84.66% for the Translog model and 83.63% for the Cobb-Douglas model, while the overall mean efficiency of rural secondary schools was 80.98% for the Translog model and 81.24% for the Cobb-Douglas model; thus, urban secondary schools performed better than rural secondary schools in the Sylhet division. The results of the Tobit model show that the teacher-student ratio had a positive influence on secondary school efficiency, whereas teaching experience of 1 to 5 years or more than 10 years, MPO-type schools, and conventional teaching methods had negative and significant influences on secondary school efficiency. The estimated value of σ-square (0.0625) was different from zero, indicating a good fit, and the value of γ (0.9872) was positive, which can be interpreted as 98.72 percent of the random variation in secondary school outcomes being due to inefficiency.
Keywords: efficiency, secondary schools, ICT, stochastic frontier analysis
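For readers unfamiliar with the framework, the Cobb-Douglas stochastic frontier referred to above is conventionally written as follows (this is the standard Battese-Coelli-type formulation, not the paper's exact specification of inputs and inefficiency effects):

```latex
\ln y_i = \beta_0 + \sum_{k} \beta_k \ln x_{ki} + v_i - u_i,
\qquad v_i \sim N(0,\sigma_v^2),\quad u_i \ge 0,
\qquad \gamma = \frac{\sigma_u^2}{\sigma_u^2 + \sigma_v^2}
```

where y_i is the school outcome, x_{ki} are inputs, v_i is random noise, u_i captures inefficiency, and a γ close to one (0.9872 here) means that most of the composed-error variation is attributable to inefficiency; the Translog form adds squared and interaction terms in ln x_{ki}.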
Procedia PDF Downloads 156
15876 Distangling Biological Noise in Cellular Images with a Focus on Explainability
Authors: Manik Sharma, Ganapathy Krishnamurthi
Abstract:
The cost of some drugs and medical treatments has risen so sharply in recent years that many patients are having to go without. A classification project could make researchers more efficient. One of the more surprising reasons behind the cost is how long it takes to bring new treatments to market. Despite improvements in technology and science, research and development continues to lag; in fact, finding a new treatment takes, on average, more than 10 years and costs hundreds of millions of dollars. If successful, we could dramatically improve the industry's ability to model cellular images according to their relevant biology, in turn greatly decreasing the cost of treatments and ensuring that these treatments get to patients faster. This work aims at solving a part of this problem by creating a cellular image classification model that can decipher the genetic perturbations in cells (occurring naturally or artificially). Another interesting question addressed is what makes the deep-learning model decide in a particular fashion, which can further help in demystifying the mechanism of action of certain perturbations and paves the way towards explainability of the deep-learning model.
Keywords: cellular images, genetic perturbations, deep-learning, explainability
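One widely used way to probe why a convolutional classifier "decides in a particular fashion" is Grad-CAM; the sketch below shows that technique in general terms and is not necessarily the explainability method the authors used, and the layer name is a placeholder:

```python
import numpy as np
import tensorflow as tf

def grad_cam(model, image, conv_layer_name):
    """Return a normalised class-activation heat map for the top predicted class."""
    grad_model = tf.keras.models.Model(
        model.inputs, [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[np.newaxis, ...])
        top_class = tf.argmax(preds[0])
        top_score = preds[:, top_class]
    grads = tape.gradient(top_score, conv_out)           # d(score)/d(feature maps)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))      # global-average-pool the gradients
    cam = tf.reduce_sum(conv_out[0] * weights, axis=-1)  # weighted sum of feature maps
    cam = tf.nn.relu(cam)                                # keep only positive evidence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()

# Usage (assuming a trained Keras classifier of cellular images):
# heatmap = grad_cam(model, cell_image, conv_layer_name="last_conv")
# The heat map highlights the image regions driving the predicted perturbation class.
```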
Procedia PDF Downloads 116
15875 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons
Authors: Ozgu Hafizoglu
Abstract:
Analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems with attributional, deep structural, causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) guides and creates information pattern transfer within and between domains and disciplines in science. This paper demonstrates the CMA as an evolutionary approach to scientific research. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions. In this paper, the model of analogical reasoning is created based on brain cells and their fractal and operational forms within the system itself. Visualization techniques are used to show correspondences. The problem-solving process is divided into distinct phases: encoding, mapping, inference, and response. The system is shown to be relevant to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond with the fractal and operational forms of brain cells: glial cells, axons, and neurons.
Keywords: analogy, analogical reasoning, cognitive model, brain and glials
Procedia PDF Downloads 191
15874 Testing the Moderating Effect of Sub Ethnic on Household Investment Behaviour
Authors: Widayat Widayat
Abstract:
Nowadays, in the modern investment era, household investment behavior is a lively topic. The development of modern investment, indicated by the emergence of a variety of instruments such as stocks, bonds and various forms of derivatives, has increased the complexity of choosing an investment, especially for traditional societies. Various studies show that there is more than one factor acting as a behavioral antecedent of the decision to choose an investment instrument. One of the factors contributing to the investment decision is ethnicity: societies with a particular sub-culture tend to prefer particular instruments, because they have different values, norms and social environments. This article is designed to test the moderating effect of the Osing-Javanese sub-ethnic distinction on investing. The study was conducted in Banyuwangi, East Java Province, Indonesia. Data were collected using questionnaires given to the heads of the households selected as the sample, with households selected by a multistage sampling method. The data were processed using SmartPLS software, and the moderating effect was tested using a grouped-sample test. The results show that sub-ethnicity has a significant role in determining investment.
Keywords: investment behaviour, household, moderating, sub ethnic
Procedia PDF Downloads 373
15873 Role of Zakat and Awqf in Socioeconomic Development of Pakistan: Exploring the Issues and Challenges
Authors: Marium. K.Makhdoom, Talat Hussain, Syed H. Bukhari
Abstract:
The motivation behind this paper is to highlight the need for Zakat as a monetary framework and as a social equity instrument for minimizing the level of poverty in society, and to assess its role in socioeconomic development. The study investigates the applied system of Islamic economics in order to propose an alternative model that contributes fundamentally to the Ummah and serves nations. The paper concludes that Zakat can be viewed as one of the appropriate measures of socioeconomic development, implying that where people pay Zakat, the level of socioeconomic development will be higher, and vice versa. The duty of Muslims to pay Zakat aims to achieve sustainable development through the redistribution of wealth among Muslims and the bridging of the gap between the rich and the poor in society. The paper also considers Zakat as an index for gauging economic development and examines the role of Zakat as an instrument of social equity and poverty eradication in society. In general, this involves the annual payment of more than two percent of one's capital after the needs of the family have been met.
Keywords: Zakat, Waqf, economic development, Pakistan, Islamic economics, macroeconomics, microeconomics
Procedia PDF Downloads 434
15872 Efficient Model Selection in Linear and Non-Linear Quantile Regression by Cross-Validation
Authors: Yoonsuh Jung, Steven N. MacEachern
Abstract:
The check loss function is used to define quantile regression. In the context of cross-validation, it is also employed as a validation function when the underlying truth is unknown. However, our empirical study indicates that validation with the check loss often leads to choosing an over-fitted model. In this work, we suggest a modified, L2-adjusted check loss which rounds the sharp corner at the middle of the check loss. To some extent, it has a large effect in guarding against over-fitted models. Through various simulation settings of linear and non-linear regression, the improvement of the check loss by the L2 adjustment is empirically examined. The adjustment is devised to shrink to zero as the sample size grows.
Keywords: cross-validation, model selection, quantile regression, tuning parameter selection
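For reference, the check loss at quantile level τ is ρ_τ(u) = u(τ − 1{u < 0}). The exact L2 adjustment used by the authors is not given in the abstract; one common way to round the corner, shown here purely as an assumed illustration, is the Huberized (quantile-Huber) form with a smoothing width δ that is taken to shrink as the sample size grows:

```latex
\rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
\qquad
\rho_{\tau,\delta}(u) = \bigl|\tau - \mathbf{1}\{u < 0\}\bigr|
\begin{cases}
\dfrac{u^2}{2\delta}, & |u| \le \delta,\\[6pt]
|u| - \dfrac{\delta}{2}, & |u| > \delta,
\end{cases}
```

which differs from the ordinary check loss only by the constant |τ − 1{u < 0}|·δ/2 outside the window [−δ, δ] and is quadratic, hence differentiable, inside it.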
Procedia PDF Downloads 439
15871 Aiming at Optimization of Tracking Technology through Seasonally Tilted Sun Trackers: An Indian Perspective
Authors: Sanjoy Mukherjee
Abstract:
Discussions on single-axis tracker (SAT) concepts are becoming more and more relevant for developing countries like India, not just as an advancement in racking technology but because of the utmost necessity of reaching the lowest Levelized Cost of Energy (LCOE) targets. With increasing competition and a significant fall in the feed-in tariffs of solar PV projects, developers are under constant pressure to secure investment for their projects and eventually earn profits from them. Moreover, being the second most populous country, India suffers from a scarcity of land because of its high average population density. So, to mitigate the risk of this double-edged sword, with the declining unit (kWh) cost on one side and land utilization on the other, tracking has evolved as the need of the hour. Therefore, the prime objectives of this paper are not only to showcase how the Seasonally Tilted Tracker (STT) proves to be an effective mechanism for gaining more global incidence in the collector plane (Ginc) with respect to traditional mounting systems, but also to introduce STT technology as a possible option for high-latitude locations.
Keywords: tracking system, grid connected solar PV plant, CAPEX reduction, levelized cost of energy
Procedia PDF Downloads 261
15870 Uncertainty in Risk Modeling
Authors: Mueller Jann, Hoffmann Christian Hugo
Abstract:
Conventional quantitative risk management in banking is a risk factor in its own right, because it rests on assumptions, such as independence and availability of data, which do not hold when rare events with extreme consequences are involved. There is a growing recognition of the need for alternative risk measures that do not make these assumptions. We propose a novel method for modeling the risk associated with investment products, in particular derivatives, by using a formal language for specifying financial contracts. Expressions in this language are interpreted in the category of values annotated with (a formal representation of) uncertainty. The choice of uncertainty formalism thus becomes a parameter of the model, so it can be adapted to the particular application and is not constrained to classical probabilities. We demonstrate our approach using a simple logic-based uncertainty model and a case study in which we assess the risk of counterparty default in a portfolio of collateralized loans.
Keywords: risk model, uncertainty monad, derivatives, contract algebra
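A very small sketch of the idea of interpreting contract expressions as values annotated with uncertainty is given below; the interval representation, the contract constructors, and the default-probability figures are all illustrative assumptions, since the paper's formal language and uncertainty formalism are not reproduced in the abstract:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Uncertain:
    """A value annotated with a simple interval of uncertainty."""
    lo: float
    hi: float

    def __add__(self, other):
        return Uncertain(self.lo + other.lo, self.hi + other.hi)

    def scale(self, k):
        return Uncertain(min(k * self.lo, k * self.hi), max(k * self.lo, k * self.hi))

# A toy contract algebra: a contract is interpreted as a function from a
# market scenario to an Uncertain discounted value.
def one():
    return lambda scenario: Uncertain(1.0, 1.0)

def scaled(k, c):
    return lambda scenario: c(scenario).scale(k)

def both(c1, c2):
    return lambda scenario: c1(scenario) + c2(scenario)

def risky(c, default_prob_range):
    """Counterparty default with an uncertain probability given as an interval."""
    lo_p, hi_p = default_prob_range
    def value(scenario):
        v = c(scenario)
        return Uncertain(v.lo * (1 - hi_p), v.hi * (1 - lo_p))
    return value

# A portfolio of two collateralized loans with uncertain default rates.
portfolio = both(risky(scaled(100.0, one()), (0.01, 0.05)),
                 risky(scaled(250.0, one()), (0.02, 0.10)))
print(portfolio({}))   # interval of recoverable value across the uncertainty
```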
Procedia PDF Downloads 580
15869 Comparison Analysis of CFD Turbulence Fluid Numerical Study for Quick Coupling
Authors: JoonHo Lee, KyoJin An, JunSu Kim, Young-Chul Park
Abstract:
In this study, the fluid flow characteristics and performance of a non-split quick coupling, used for flow control in hydraulic system equipment in the aerospace sector, are predicted through a numerical CFD study. Turbulence models were considered for the application of computational fluid dynamics to the CFD model of the non-split quick coupling. In addition, the adequacy of the CFD model was verified by comparison with standard values. Based on this analysis, the fluid flow characteristics can be accurately predicted. On the basis of these research results, designing for the fluid flow characteristics therefore contributes to the reliability of the quick coupling required in industry.
Keywords: CFD, FEM, quick coupling, turbulence
Procedia PDF Downloads 389
15868 Deepfake Detection for Compressed Media
Authors: Sushil Kumar Gupta, Atharva Joshi, Ayush Sonawale, Sachin Naik, Rajshree Khande
Abstract:
The use of artificially created videos and audio generated by deep learning is a major problem in the current media landscape, as it fuels misinformation and distrust. The objective of this work is therefore to build a reliable deep-learning-based deepfake detection model that helps detect forged videos accurately. In this work, CelebDF v1, one of the largest deepfake benchmark datasets in the literature, is adopted to train and test the proposed models. The data include authentic and synthetic videos of high quality, allowing an assessment of the model’s performance against realistic distortions.
Keywords: deepfake detection, CelebDF v1, convolutional neural network (CNN), xception model, data augmentation, media manipulation
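Since the keywords mention an Xception model and data augmentation, a minimal sketch of an Xception-based frame-level detector is given below; the head architecture, input pipeline, and training choices are assumptions for illustration rather than the authors' exact configuration:

```python
import tensorflow as tf

IMG_SIZE = (299, 299)          # Xception's native input resolution

def build_detector():
    base = tf.keras.applications.Xception(
        weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
    base.trainable = False     # start by training only the classification head
    inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.xception.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # real vs. fake
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
    return model

# Frames extracted from CelebDF videos could be fed in via, e.g.,
# tf.keras.utils.image_dataset_from_directory("frames/", image_size=IMG_SIZE,
#                                             label_mode="binary"),
# with flips or re-compression applied as augmentation.
model = build_detector()
model.summary()
```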
Procedia PDF Downloads 15
15867 Designing a Model for Preparing Reports on the Automatic Earned Value Management Progress by the Integration of Primavera P6, SQL Database, and Power BI: A Case Study of a Six-Storey Concrete Building in Mashhad, Iran
Authors: Hamed Zolfaghari, Mojtaba Kord
Abstract:
Project planners and controllers are frequently faced with the challenge of inadequate software for the preparation of automatic project progress reports based on actual project information updates. They usually build dashboards in Microsoft Excel, which are local and not accessible online; another shortcoming is that such dashboards are not linked to planning software such as Microsoft Project and lack the database required for data storage. This study aimed to propose a model for preparing automatic online project progress reports based on actual project information updates by integrating Primavera P6, an SQL database, and Power BI for a construction project. The designed model is applicable to project planners and controllers, enabling them to prepare project reports automatically and immediately after updating the project schedule with actual information. To develop the model, the data were entered into P6, and the information was stored in the SQL database. The proposed model can prepare a wide range of reports, such as earned value management, HR, financial, physical, and risk reports, automatically in the Power BI application. Furthermore, the reports can be published and shared online.
Keywords: primavera P6, SQL, Power BI, EVM, integration management
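The earned value management figures such a dashboard would surface are computed from a handful of standard formulas; the sketch below shows those calculations on made-up numbers (the budget, percentages, and cost are placeholders, and the actual model would pull these values from the P6/SQL data rather than from literals):

```python
def earned_value_metrics(bac, planned_pct, earned_pct, actual_cost):
    """Standard earned value management (EVM) indicators."""
    pv = bac * planned_pct          # Planned Value
    ev = bac * earned_pct           # Earned Value
    ac = actual_cost                # Actual Cost
    sv, cv = ev - pv, ev - ac       # schedule / cost variance
    spi = ev / pv if pv else float("nan")     # Schedule Performance Index
    cpi = ev / ac if ac else float("nan")     # Cost Performance Index
    eac = bac / cpi if cpi else float("nan")  # Estimate At Completion (CPI method)
    return {"PV": pv, "EV": ev, "AC": ac, "SV": sv, "CV": cv,
            "SPI": spi, "CPI": cpi, "EAC": eac}

# Example: a six-storey building project with a 10 M budget, 45% planned,
# 40% physically complete, and 4.2 M spent to date (all figures hypothetical).
print(earned_value_metrics(bac=10_000_000, planned_pct=0.45,
                           earned_pct=0.40, actual_cost=4_200_000))
```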
Procedia PDF Downloads 112
15866 Artificial Neural Network Based Parameter Prediction of Miniaturized Solid Rocket Motor
Authors: Hao Yan, Xiaobing Zhang
Abstract:
The working mechanism of miniaturized solid rocket motors (SRMs) is not yet fully understood. It is imperative to explore its unique features. However, there are many disadvantages to using common multi-objective evolutionary algorithms (MOEAs) in predicting the parameters of the miniaturized SRM during its conceptual design phase. Initially, the design variables and objectives are constrained in a lumped parameter model (LPM) of this SRM, which leads to local optima in MOEAs. In addition, MOEAs require a large number of calculations due to their population strategy. Although the calculation time for simulating an LPM just once is usually less than that of a CFD simulation, the number of function evaluations (NFEs) is usually large in MOEAs, which makes the total time cost unacceptably long. Moreover, the accuracy of the LPM is relatively low compared to that of a CFD model due to its assumptions. CFD simulations or experiments are required for comparison and verification of the optimal results obtained by MOEAs with an LPM. The conceptual design phase based on MOEAs is a lengthy process, and its results are not precise enough due to the above shortcomings. An artificial neural network (ANN) based parameter prediction is proposed as a way to reduce time costs and improve prediction accuracy. In this method, an ANN is used to build a surrogate model that is trained with a 3D numerical simulation. In design, the original LPM is replaced by a surrogate model. Each case uses the same MOEAs, in which the calculation time of the two models is compared, and their optimization results are compared with 3D simulation results. Using the surrogate model for the parameter prediction process of the miniaturized SRMs results in a significant increase in computational efficiency and an improvement in prediction accuracy. Thus, the ANN-based surrogate model does provide faster and more accurate parameter prediction for an initial design scheme. Moreover, even when the MOEAs converge to local optima, the time cost of the ANN-based surrogate model is much lower than that of the simplified physical model LPM. This means that designers can save a lot of time during code debugging and parameter tuning in a complex design process. Designers can reduce repeated calculation costs and obtain accurate optimal solutions by combining an ANN-based surrogate model with MOEAs.
Keywords: artificial neural network, solid rocket motor, multi-objective evolutionary algorithm, surrogate model
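A compact sketch of the surrogate idea, an ANN regressor trained on a batch of expensive evaluations and then queried in place of the simulation inside the optimizer, is shown below; the design variables, the stand-in objective function, and the network size are illustrative assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical design variables (e.g. normalised grain length, throat diameter)
# and a hypothetical expensive objective standing in for the 3D simulation.
def expensive_simulation(x):
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 1]

X = rng.uniform(0.0, 1.0, size=(300, 2))     # design-of-experiments samples
y = expensive_simulation(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0),
)
surrogate.fit(X_train, y_train)
print("surrogate R^2 on held-out samples:", surrogate.score(X_test, y_test))

# Inside the MOEA loop, the surrogate replaces the expensive model:
candidate = np.array([[0.3, 0.7]])
print("predicted objective:", surrogate.predict(candidate))
```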
Procedia PDF Downloads 93
15865 Verification of a Simple Model for Rolling Isolation System Response
Authors: Aarthi Sridhar, Henri Gavin, Karah Kelly
Abstract:
Rolling Isolation Systems (RISs) are simple and effective means to mitigate earthquake hazards to equipment in critical and precious facilities, such as hospitals, network collocation facilities, supercomputer centers, and museums. The RIS works by isolating components from floor accelerations, reducing the inertial forces felt by the subsystem. The RIS consists of two platforms with counter-facing concave surfaces (dishes) in each corner. Steel balls lie inside the dishes and allow relative motion between the top and bottom platforms. Previously, a mathematical model for the dynamics of RISs was developed using Lagrange’s equations (LE) and experimentally validated. A new mathematical model was developed using Gauss’s Principle of Least Constraint (GPLC) and verified by comparing impulse response trajectories of the GPLC model and the LE model in terms of the peak displacements and accelerations of the top platform. Mathematical models for the RIS are tedious to derive because of the non-holonomic rolling constraints imposed on the system. However, using Gauss’s Principle of Least Constraint to find the equations of motion removes some of the obscurity and yields a system that can be easily extended. Though the GPLC model requires more state variables, the equations of motion are far simpler. The non-holonomic constraint is enforced in terms of accelerations and therefore requires additional constraint stabilization methods in order to prevent numerical integration from driving the system unstable. The GPLC model allows the incorporation of more physical aspects of the RIS, such as the contribution of the vertical velocity of the platform to the kinetic energy and the mass of the balls. This mathematical model of the RIS is a tool to predict the motion of the isolation platform. The ability to statistically quantify the expected responses of the RIS is critical in the implementation of earthquake hazard mitigation.
Keywords: earthquake hazard mitigation, earthquake isolation, Gauss’s Principle of Least Constraint, nonlinear dynamics, rolling isolation system
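For readers unfamiliar with it, Gauss's Principle of Least Constraint states that, among all accelerations consistent with the constraints, the actual acceleration minimizes a mass-weighted deviation from the unconstrained acceleration; the standard statement for constraints written linearly in the accelerations is (this is the textbook form, not the paper's specific RIS equations):

```latex
\min_{\ddot{q}} \; Z(\ddot{q}) = \tfrac{1}{2}\,(\ddot{q} - a)^{\mathsf T} M\,(\ddot{q} - a),
\qquad a = M^{-1} Q(q,\dot q, t),
\quad \text{subject to } A(q,\dot q, t)\,\ddot{q} = b(q,\dot q, t),
```

whose closed-form solution is \(\ddot{q} = a + M^{-1/2}\bigl(A M^{-1/2}\bigr)^{+}(b - A a)\). Because the rolling constraint enters only through the acceleration-level statement \(A\ddot q = b\), constraint drift must be controlled by a stabilization scheme during time integration, as noted above.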
Procedia PDF Downloads 255
15864 Assessment of Modern RANS Models for the C3X Vane Film Cooling Prediction
Authors: Mikhail Gritskevich, Sebastian Hohenstein
Abstract:
The paper presents the results of a detailed assessment of several modern Reynolds-Averaged Navier-Stokes (RANS) turbulence models for the prediction of C3X vane film cooling at various injection regimes. Three models are considered, namely the Shear Stress Transport (SST) model, the modification of the SST model accounting for streamline curvature (SST-CC), and the Explicit Algebraic Reynolds Stress Model (EARSM). It is shown that all the considered models face a problem in predicting the adiabatic effectiveness in the vicinity of the cooling holes; however, accounting for the Reynolds stress anisotropy within the EARSM model noticeably increases the solution accuracy. Further downstream, all the models provide reasonable agreement with the experimental data for the adiabatic effectiveness, and among the considered models the most accurate results are obtained with the EARSM.
Keywords: discrete holes film cooling, Reynolds Averaged Navier-Stokes (RANS), Reynolds stress tensor anisotropy, turbulent heat transfer
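The quantity being compared, the adiabatic film-cooling effectiveness, is conventionally defined as follows (standard definition, included here for context):

```latex
\eta_{aw} = \frac{T_{\infty} - T_{aw}}{T_{\infty} - T_{c}}
```

where T_∞ is the mainstream (hot gas) temperature, T_aw the adiabatic wall temperature, and T_c the coolant temperature at injection; η_aw = 1 means the wall feels only coolant, while η_aw = 0 means no cooling benefit.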
Procedia PDF Downloads 422
15863 Islamic Finance: What is the Outlook for Italy?
Authors: Paolo Pietro Biancone
Abstract:
The spread of Islamic financial instruments is an opportunity to offer integration for the immigrant population and to attract, through specific products, the wealth of sovereign funds from the "Arab" countries. However, it is important to consider the possibility of comparing a traditional finance model, which in recent times has given rise to many doubts, with an "alternative" finance model in which the ethical aspect arising from religious principles is very important.
Keywords: banks, Europe, Islamic finance, Italy
Procedia PDF Downloads 276
15862 Techno-Economic Assessment of Distributed Heat Pumps Integration within a Swedish Neighborhood: A Cosimulation Approach
Authors: Monica Arnaudo, Monika Topel, Bjorn Laumert
Abstract:
Within the Swedish context, the current trend of relatively low electricity prices promotes the electrification of the energy infrastructure. The residential heating sector takes part in this transition by proposing a switch from a centralized district heating system towards a distributed heat-pump-based setting. When it comes to urban environments, two issues arise. The first, seen from an electricity-sector perspective, is that existing networks are limited with regard to their installed capacities; additional electric loads, such as heat pumps, can cause severe overloads on crucial network elements. The second, seen from a heating-sector perspective, is that indoor comfort conditions can become difficult to maintain when the operation of the heat pumps is limited by the risk of overloading the distribution grid. Furthermore, the uncertainty of future electricity market prices introduces an additional variable. This study aims at assessing the extent to which distributed heat pumps can penetrate an existing heat energy network while respecting the technical limitations of the electricity grid and the thermal comfort levels in the buildings. In order to account for the multi-disciplinary nature of this research question, a cosimulation modeling approach was adopted, in which each energy technology is modeled in its customized simulation environment. As part of the cosimulation methodology, a steady-state power flow analysis in pandapower was used to model the electrical distribution grid, a thermal balance model of a reference building was implemented in EnergyPlus to account for space heating, and a fluid-cycle model of a heat pump was implemented in JModelica to account for the actual heating technology. With the models in place, different scenarios based on forecasted electricity market prices were developed for both present and future conditions of Hammarby Sjöstad, a neighborhood located in the south-east of Stockholm (Sweden). For each scenario, the technical and comfort conditions were assessed. Additionally, the average cost of heat generation was estimated in terms of the levelized cost of heat; this indicator enables a techno-economic comparison among the different scenarios. In order to evaluate the levelized cost of heat, a yearly performance simulation of the energy infrastructure was implemented. The scenarios related to current electricity prices show that distributed heat pumps can replace the district heating system by covering up to 30% of the heating demand. By lowering the minimum accepted indoor temperature of the apartments by 2°C, this level of penetration can increase up to 40%. In the future scenarios, if electricity prices increase, as is most likely expected within the next decade, the penetration of distributed heat pumps may be limited to 15%. In terms of the levelized cost of heat, residential heat pump technology becomes competitive only within a scenario of decreasing electricity prices; in this case, the district heating system is characterized by an average cost of heat generation 7% higher than the distributed heat pumps option.
Keywords: cosimulation, distributed heat pumps, district heating, electrical distribution grid, integrated energy systems
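The levelized cost of heat used for the comparison is conventionally computed as the ratio of discounted lifetime costs to discounted heat delivered; the generic form below is given for context, with the cost categories (investment, operation and maintenance, electricity or fuel) being the usual ones rather than the paper's exact breakdown:

```latex
\mathrm{LCOH} = \frac{\displaystyle\sum_{t=0}^{N} \frac{I_t + OM_t + E_t}{(1+r)^t}}
                     {\displaystyle\sum_{t=1}^{N} \frac{Q_t}{(1+r)^t}}
```

where I_t is the investment cost, OM_t the operation and maintenance cost, E_t the electricity (or fuel) cost, Q_t the heat delivered in year t, r the discount rate, and N the economic lifetime.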
Procedia PDF Downloads 155
15861 The BL-5D Model: The Development of a Model of Instructional Design for Blended Learning Activities
Authors: Damian Gordon, Paul Doyle, Anna Becevel, Júlia Vilafranca Molero, Cinta Gascon, Arianna Vitiello, Tina Baloh
Abstract:
It has long been recognized that the creation of any teaching content can be enhanced if the development process follows a pre-defined approach, which is often referred to as an instructional design methodology. These methodologies typically define a number of stages, or phases, that an educator should undertake to help ensure the quality of the final teaching content that is developed. In this paper, we present an instructional design methodology that is focused specifically on the introduction of blended resources into a heretofore bricks-and-mortar course. To achieve this, research was undertaken concerning a range of models of instructional design, as well as literature covering some of the key challenges and “pain points” of blending. Following this, our model, the BL-5D model, is presented, which incorporates some key questions at each stage of this five-stage methodology to guide the development process. Finally, a discussion of some of the key themes and issues that have been uncovered in this work is presented, as well as a template for a blended learning case study that emerged from this approach.
Keywords: blended learning, challenges of blended learning, design methodologies, instructional design
Procedia PDF Downloads 127
15860 Numerical Simulation of a Three-Dimensional Framework under the Action of Two-Dimensional Moving Loads
Authors: Jia-Jang Wu
Abstract:
The objective of this research is to develop a general technique so that one may predict the dynamic behaviour of a three-dimensional scale crane model subjected to time-dependent moving point forces by means of conventional finite element computer packages. To this end, the whole scale crane model is divided into two parts: the stationary framework and the moving substructure. In such a case, the dynamic responses of the scale crane model can be predicted from the forced vibration responses of the stationary framework due to the action of the four time-dependent moving point forces induced by the moving substructure. Since the magnitudes and positions of the moving point forces depend on the relative positions of the trolley, the moving substructure and the stationary framework, it is found from the numerical results that the time histories of the moving speeds of the moving substructure and the trolley are the key factors affecting the dynamic responses of the scale crane model.
Keywords: moving load, moving substructure, dynamic responses, forced vibration responses
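In conventional finite element packages, a moving point force is usually applied by distributing it to the nodes of the element it currently occupies via the element shape functions; the sketch below shows this standard consistent-load calculation for an Euler-Bernoulli beam element and is an illustration of the general technique, not the paper's specific implementation:

```python
import numpy as np

def equivalent_nodal_loads(P, x, L):
    """Consistent nodal loads for a transverse point force P located a
    distance x along a 2-node Euler-Bernoulli beam element of length L.
    Returns [F1, M1, F2, M2] (forces and moments at the two nodes)."""
    xi = x / L
    N = np.array([
        1 - 3 * xi**2 + 2 * xi**3,      # translation shape function, node 1
        L * (xi - 2 * xi**2 + xi**3),   # rotation shape function, node 1
        3 * xi**2 - 2 * xi**3,          # translation shape function, node 2
        L * (-xi**2 + xi**3),           # rotation shape function, node 2
    ])
    return P * N

# As the load moves, the element it occupies and the local coordinate x change
# with time, so the nodal load vector is recomputed at every time step.
print(equivalent_nodal_loads(P=-1000.0, x=0.3, L=1.0))
```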
Procedia PDF Downloads 355
15859 Social Collaborative Learning Model Based on Proactive Involvement to Promote the Global Merit Principle in Cultivating Youths' Morality
Authors: Wera Supa, Panita Wannapiroon
Abstract:
This paper reports on the design of a social collaborative learning model based on proactive involvement to promote the global merit principle in cultivating youths’ morality. The research proceeded in two phases: the first was to design the social collaborative learning model based on proactive involvement, and the second was to evaluate it. The sample group in this study consists of 15 experts in proactive participation, the merit principle, and youths’ morality cultivation, drawn from executives, lecturers, and information and communication technology professionals, and selected using the purposive sampling method. Data were analyzed using the arithmetic mean and standard deviation. The study found four significant factors in promoting hands-on collaboration under the global merit scheme in order to instill virtues in adolescents: 1) information and communication technology usage; 2) proactive involvement; 3) morality cultivation policy; and 4) the global merit principle. The experts agree that the social collaborative learning model based on proactive involvement is highly appropriate.
Keywords: social collaborative learning, proactive involvement, global merit principle, morality
Procedia PDF Downloads 390