Search results for: ensemble model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16432

15592 Students’ Learning Effects in Physical Education between the Sport Education Model with TPSR and the Traditional Teaching Model with TPSR

Authors: Yi-Hsiang Pan, Chen-Hui Huang, Ching-Hsiang Chen, Wei-Ting Hsu

Abstract:

The purpose of the study was to compare students' learning effects in physical education between a curriculum merging Teaching Personal and Social Responsibility (TPSR) with the sport education model and one merging TPSR with the traditional teaching model. The learning effects examined were sport self-efficacy, sport enthusiasm, group cohesion, responsibility, and game performance. The participants were 3 high school physical education teachers and 6 physical education classes totalling 133 students (75 in the experimental group and 58 in the control group); each teacher taught one experimental class and one control class for 16 weeks. The research methods included questionnaire investigation, interviews, and focus group meetings. The research instruments included a personal and social responsibility questionnaire, a sport enthusiasm scale, a group cohesion scale, a sport self-efficacy scale, and a game performance assessment instrument. Multivariate analysis of covariance and repeated-measures ANOVA were used to test differences in students' learning effects between the two curricula. The findings were: 1) The sport education model with TPSR improved students' learning effects, including sport self-efficacy, game performance, sport enthusiasm, group cohesion, and responsibility. 2) The traditional teaching model with TPSR improved students' learning effects, including sport self-efficacy, responsibility, and game performance. 3) The sport education model with TPSR improved more learning effects than the traditional teaching model with TPSR, including sport self-efficacy, sport enthusiasm, responsibility, and game performance. 4) Based on qualitative data about the learning experiences of teachers and students, the sport education model with TPSR significantly improved learning motivation, group interaction, and game sense.
The conclusions indicated that the sport education model with TPSR yields more learning effects in the physical education curriculum. On the other hand, the hybrid TPSR-Sport Education and TPSR-Traditional Teaching curricular projects are both sound approaches to moral character education and may be applied in school physical education.

Keywords: character education, sport season, game performance, sport competence

Procedia PDF Downloads 420
15591 Simulation of Large Hadron Collisions Using Monte Carlo Tools

Authors: E. Al Daoud

Abstract:

In many cases, theoretical treatments are available for models for which there is no perfect physical realization. In this situation, the only possible test of an approximate theoretical solution is comparison with data generated by computer simulation. In this paper, Monte Carlo tools are used to study and compare elementary particle models. All the experiments are implemented using 10,000 events, and the simulated energy is 13 TeV. The mean and the curves of several variables are calculated for each model using MadAnalysis 5. Anomalies in the results can be seen in the muon masses of the minimal supersymmetric standard model and the two-Higgs-doublet model.
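The statistical reasoning behind such event-based comparisons can be sketched with a toy Monte Carlo: the mean of an observable over N simulated events carries a statistical uncertainty of roughly std/sqrt(N). The Gaussian observable below is a hypothetical stand-in, not a MadAnalysis 5 output.

```python
import math
import random

def monte_carlo_mean(sample, n_events=10000, seed=0):
    """Estimate the mean of an observable over n_events simulated events,
    returning (mean, statistical uncertainty ~ std / sqrt(N))."""
    rng = random.Random(seed)
    values = [sample(rng) for _ in range(n_events)]
    mean = sum(values) / n_events
    var = sum((v - mean) ** 2 for v in values) / (n_events - 1)
    return mean, math.sqrt(var / n_events)

# Toy observable: a Gaussian-distributed quantity with true mean 0.
mean, err = monte_carlo_mean(lambda rng: rng.gauss(0.0, 1.0))
```

With 10,000 events the uncertainty on the mean is about 1% of the event-level spread, which is why comparisons between models at this sample size can resolve small shifts in a variable's mean.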

Keywords: Feynman rules, hadrons, Lagrangian, Monte Carlo, simulation

Procedia PDF Downloads 293
15590 A Hybrid Particle Swarm Optimization-Nelder-Mead Algorithm (PSO-NM) for Nelson-Siegel-Svensson Calibration

Authors: Sofia Ayouche, Rachid Ellaia, Rajae Aboulaich

Abstract:

Today, insurers may use the yield curve as an indicator of the profit or performance of their portfolios; therefore, they model it with a class of models able to fit and forecast the future term structure of interest rates: the Nelson-Siegel-Svensson (NSS) model. Unfortunately, many authors have reported difficulties in calibrating the model because the optimization problem is not convex and has multiple local optima. In this context, we implement a hybrid Particle Swarm Optimization and Nelder-Mead algorithm in order to minimize, by the least squares method, the difference between the zero-coupon curve and the NSS curve.
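The calibration target can be sketched as follows: the NSS yield for maturity t under parameters (β0, β1, β2, β3, τ1, τ2) and the least-squares objective that PSO-NM would minimize. The parameter values and maturities used below are illustrative, not data from the paper.

```python
import math

def nss_yield(t, b0, b1, b2, b3, tau1, tau2):
    """Nelson-Siegel-Svensson zero-coupon yield at maturity t (years)."""
    x1 = t / tau1
    x2 = t / tau2
    f1 = (1 - math.exp(-x1)) / x1                   # slope loading
    f2 = f1 - math.exp(-x1)                         # first curvature loading
    f3 = (1 - math.exp(-x2)) / x2 - math.exp(-x2)   # second curvature loading
    return b0 + b1 * f1 + b2 * f2 + b3 * f3

def objective(params, maturities, market_yields):
    """Sum of squared differences between market and NSS yields --
    the non-convex function the hybrid PSO-NM algorithm minimizes."""
    return sum((y - nss_yield(t, *params)) ** 2
               for t, y in zip(maturities, market_yields))

# Illustrative parameter set: long-run level 4%, short end at 2%.
params = (0.04, -0.02, 0.01, 0.005, 1.5, 5.0)
maturities = [1.0, 2.0, 5.0, 10.0]
fitted = [nss_yield(t, *params) for t in maturities]
```

In practice the hybrid would run PSO for global exploration of the parameter space and refine the best particle with a Nelder-Mead simplex search (e.g., `scipy.optimize.minimize(..., method='Nelder-Mead')`), which is one standard way to combine the two.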

Keywords: optimization, zero-coupon curve, Nelson-Siegel-Svensson, particle swarm optimization, Nelder-Mead algorithm

Procedia PDF Downloads 409
15589 Comparison of Volume of Fluid Model: Experimental and Empirical Results for Flows over Stacked Drop Manholes

Authors: Ramin Mansouri

Abstract:

A manhole is a structure installed where the flow direction or pipe diameter of sewage pipes changes, as well as in steep-slope areas, to reduce the flow velocity. In this study, the flow characteristics of a stacked drop manhole structure were investigated with a numerical model. Coarse, medium, and fine computational grids were used for the simulation. To simulate the flow, the k-ε turbulence models (standard, RNG, Realizable) and the k-ω models (standard, SST) were used. To find the best wall treatment, standard and non-equilibrium wall functions were also investigated. Of all the models, the k-ε turbulence model showed the highest correlation with the experimental results. For the boundary conditions, a constant velocity was set at the flow inlet, a pressure outlet was set at the boundaries in contact with air, and the standard wall function was used for the wall treatment. In the numerical model, the depth at the outlet of the second manhole is estimated to be less than that of the laboratory, as is the output jet from the span. In the second regime, the jet collides with the manhole wall and divides into two parts, so the hydraulic characteristics are the same as those of a large vertical shaft. In this situation, the turbulence is in a high range, and more energy loss can be seen. According to the results, the energy loss in the numerical model is estimated at 9.359%, which is more than the experimental data.

Keywords: manhole, energy, depreciation, turbulence model, wall function, flow

Procedia PDF Downloads 50
15588 Static Properties of Ge and Sr Isotopes in the Cluster Model

Authors: Mohammad Reza Shojaei, Mahdeih Mirzaeinia

Abstract:

We have studied the cluster structure of even-even stable isotopes of Ge and Sr. The Schrodinger equation has been solved using the generalized parametric Nikiforov-Uvarov method with a phenomenological potential. This potential is the sum of an attractive Yukawa-like potential, a Manning-Rosen-type potential, and a repulsive Yukawa potential for the interaction between the cluster and the core. We have shown that the available experimental data for the first rotational band energies can be well described by assuming a binary system of the α cluster and the core and using an analytical solution. Our results were consistent with the experimental values. Hence, this model can be applied to study other even-even isotopes.

Keywords: cluster model, NU method, Ge and Sr, central potential

Procedia PDF Downloads 46
15587 A Collaborative Action Research on the Teaching of Music Learning Center in Taiwan's Preschool

Authors: Mei-Ying Liao, Lee-Ching Wei, Jung-Hsiang Tseng

Abstract:

The main purpose of this study was to explore the process of planning and running a music learning center in a preschool. The study was conducted through a collaborative action research method. The research members included a university music professor, a teaching guide, a preschool director, and a preschool teacher leading a class of 5-6-year-old children. Five teaching cycles were performed on the subject of birds. Throughout the three-month process, the research members repeatedly maintained conversation, reflection, and revision. Triangulation was used to validate the collected data, including archives, interviews, seminars, observations, journals, and learning evaluations, to improve the validity and reliability of the research. It was found that a successful music learning center requires comprehensive planning and execution. It is also important to develop good listening, singing, respect, and homing habits at the beginning of running the music learning center. By providing diverse musical instruments, learning materials, and activities in a timely manner according to the teaching goals, children's desire to learn was highly stimulated. Besides, peer interactions improved their ensemble and problem-solving abilities. The collaborative action research enhanced the preschool teacher's confidence and promoted the professional growth of the research members.

Keywords: collaborative action research, case study, music learning center, music development

Procedia PDF Downloads 353
15586 Conceptual Model for Logistics Information System

Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves

Abstract:

Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that facilitate decision making based on a global perspective of the system under study has become essential. This article shows how a concept-based model makes it possible to organize and represent reality in an appropriate way, showing accurate and timely information, features that make this kind of model an ideal component to support an information system, recognizing that such information is essential to establish the particularities that allow better performance in the sector evaluated.

Keywords: system, information, conceptual model, logistics

Procedia PDF Downloads 468
15585 Automatic Flood Prediction Using Rainfall Runoff Model in Moravian-Silesian Region

Authors: B. Sir, M. Podhoranyi, S. Kuchar, T. Kocyan

Abstract:

Rainfall-runoff models play an important role in hydrological predictions. However, the model is only one part of the process of creating a flood prediction. The aim of this paper is to show the process behind a successful prediction for a flood event (May 15-18, 2014). The prediction was performed by the rainfall-runoff model HEC-HMS, one of the models computed within the Floreon+ system. The paper briefly evaluates the results of automatic hydrologic prediction on the river Olše catchment and its gauges Český Těšín and Věřňovice.
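The core rainfall-to-runoff step in such a model can be illustrated with the SCS curve number relation, one of the loss methods HEC-HMS supports; the curve number and rainfall depth below are made-up values, not data from the Olše catchment.

```python
def scs_runoff(rainfall_mm, curve_number):
    """SCS curve number method: direct runoff depth Q (mm) from rainfall
    depth P (mm), with potential retention S derived from the CN."""
    s = 25400.0 / curve_number - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s                         # initial abstraction
    if rainfall_mm <= ia:
        return 0.0                       # all rainfall absorbed, no runoff
    return (rainfall_mm - ia) ** 2 / (rainfall_mm + 0.8 * s)

# Hypothetical storm: 50 mm of rain on a catchment with CN = 80.
q = scs_runoff(50.0, 80)
```

A fully impervious catchment (CN = 100) returns all rainfall as runoff, while a small storm on a permeable catchment produces none, which is the qualitative behaviour a calibrated rainfall-runoff model must reproduce.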

Keywords: flood, HEC-HMS, prediction, rainfall, runoff

Procedia PDF Downloads 368
15584 Dynamic Gabor Filter Facial Features-Based Recognition of Emotion in Video Sequences

Authors: T. Hari Prasath, P. Ithaya Rani

Abstract:

In the world of visual technology, recognizing emotions from face images is a challenging task. Several related methods have not utilized dynamic facial features effectively for high performance. This paper proposes a high-performance method for emotion recognition using dynamic facial features. Initially, local features are captured by Gabor filters at different scales and orientations in each frame to find the position and scale of the face against different backgrounds. The Gabor features are sent to an ensemble classifier for detecting Gabor facial features. The region of dynamic features is captured from the Gabor facial features in consecutive frames, representing the dynamic variations of facial appearance. Each region of dynamic features is normalized using the Z-score normalization method and further encoded into binary pattern features with the help of threshold values. The binary features are passed to a multi-class AdaBoost classifier trained on a database containing happiness, sadness, surprise, fear, anger, disgust, and neutral expressions to classify the discriminative dynamic features for emotion recognition. The developed method is evaluated on the Ryerson Multimedia Research Lab and Cohn-Kanade databases and shows significant performance improvement over existing methods owing to its dynamic features.
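The normalize-then-threshold step described above can be sketched as follows; the feature values and the threshold of 0 are illustrative choices, not the paper's.

```python
import math

def z_score(values):
    """Normalize a feature vector to zero mean and unit variance."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

def binarize(values, threshold=0.0):
    """Encode normalized features as a binary pattern via a threshold."""
    return [1 if v > threshold else 0 for v in values]

# Toy dynamic-feature vector from one region (made-up numbers).
features = [4.0, 8.0, 15.0, 16.0, 23.0, 42.0]
normalized = z_score(features)
pattern = binarize(normalized)
```

Z-score normalization makes regions from different frames comparable regardless of their raw intensity range, after which a fixed threshold yields the binary pattern fed to the classifier.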

Keywords: detecting face, Gabor filter, multi-class AdaBoost classifier, Z-score normalization

Procedia PDF Downloads 250
15583 Implementation of IWA-ASM1 Model for Simulating the Wastewater Treatment Plant of Beja by GPS-X 5.1

Authors: Fezzani Boubaker

Abstract:

The modified activated sludge model (ASM1, or Mantis) is a generic structured model and a common platform for the dynamic simulation of a variety of aerobic processes, for the optimization and upgrading of existing plants, and for the design of new facilities. In this study, the modified ASM1 included in the GPS-X software was used to simulate the wastewater treatment plant (WWTP) of Beja, which treats domestic sewage mixed with baker's yeast factory effluent. The results of daily measurements and operating records were used to calibrate the model. A sensitivity analysis and an automatic optimization analysis were conducted to determine the most sensitive and optimal parameters. The results indicated that the ASM1 model could simulate with good accuracy the COD concentration of effluents from the WWTP of Beja for all months of the year 2012. In addition, it captures the disruption observed at the plant outlet when baker's yeast factory effluent is injected at high concentrations varying between 20 and 80 g/l.

Keywords: ASM1, activated sludge, baker’s yeast effluent, modelling, simulation, GPS-X 5.1 software

Procedia PDF Downloads 322
15582 Multivariate Analysis on Water Quality Attributes Using Master-Slave Neural Network Model

Authors: A. Clementking, C. Jothi Venkateswaran

Abstract:

Mathematical and computational functionalities such as descriptive mining, optimization, and prediction are espoused to support natural resource planning. Optimization techniques are adopted for water quality prediction and the determination of its influencing attributes. Water properties are tainted when one water resource is merged with another. This work aimed to predict the connectivity of water resource distribution with respect to water quality and sediment using a proposed master-slave back-propagation neural network model. The work proceeded by collecting water quality attributes, computing a water quality index, designing and developing a neural network model to determine water quality and sediment, applying the master-slave back-propagation model to determine variations in water quality and sediment attributes between water resources, and making recommendations for connectivity. Homogeneous and parallel biochemical reactions influence water quality and sediment when water is distributed from one location to another. Therefore, a master-slave neural network model [M(9:9:2)::S(9:9:2)] was designed and developed to predict the attribute variations: the training dataset is given as input to the master model, and its maximum weights are assigned as input to the slave model to predict the water quality. The developed master-slave model predicted physicochemical attribute weight variations for 85% to 90% of the water quality target values. Sediment level variations were also predicted, from 0.01% to 0.05% of each water quality percentage. The model produced significant variations in physicochemical attribute weights. According to the predicted weight variations on the training dataset, effective recommendations are made for connecting different resources.

Keywords: master-slave back propagation neural network model (MSBPNNM), water quality analysis, multivariate analysis, environmental mining

Procedia PDF Downloads 447
15581 The Future of Insurance: P2P Innovation versus Traditional Business Model

Authors: Ivan Sosa Gomez

Abstract:

Digitalization has impacted the entire insurance value chain, and the growing movement towards P2P platforms and the collaborative economy is also beginning to have a significant impact. P2P insurance is defined as an innovation enabling policyholders to pool their capital, self-organize, and self-manage their own insurance. In this context, new InsurTech start-ups are emerging as peer-to-peer (P2P) providers, based on a model that differs from traditional insurance. As a result, although P2P platforms do not change the fundamental basis of insurance, they do enable potentially more efficient business models to be established in terms of ensuring the coverage of risk. It is therefore relevant to determine whether P2P innovation can have substantial effects on the future of the insurance sector. For this purpose, it is considered necessary to develop P2P innovation from a business perspective, as well as to build a comparison between a traditional model and a P2P model from an actuarial perspective. Objectives: The objectives are (1) to represent P2P innovation in the business model compared to the traditional insurance model and (2) to establish a comparison between a traditional model and a P2P model from an actuarial perspective. Methodology: The research design is defined as action research, in the sense of understanding and solving the problems of a collectivity linked to an environment, applying theory and best practices. The study is carried out through the participatory variant, which involves the collaboration of the participants, who in this design are considered experts. For this purpose, prolonged immersion in the field is carried out as the main instrument for data collection. Finally, an actuarial model for the calculation of premiums is developed, which allows projections of future scenarios and the generation of conclusions between the two models.
Main Contributions: From an actuarial and business perspective, we aim to contribute by developing a comparison of the two models with respect to the coverage of risk, in order to determine whether P2P innovation can have substantial effects on the future of the insurance sector.

Keywords: Insurtech, innovation, business model, P2P, insurance

Procedia PDF Downloads 68
15580 Modelling Enablers of Service Using ISM: Implications for Quality Improvements in Healthcare Sector of UAE

Authors: Flevy Lasrado

Abstract:

Purpose: The purpose of this paper is to show the relationships between the service quality dimensions and to model them, using interpretive structural modelling (ISM), in order to propose quality improvements. Methodology: This paper used interpretive structural modelling (ISM). The data were collected from expert opinions via a questionnaire. The detailed method of using ISM is discussed in the paper. Findings: The present research work provides an ISM-based model for understanding the relationships among the service quality dimensions. Practical Implications/Original Value: An ISM-based model has been developed for a healthcare facility to improve customer satisfaction and increase market share. Although there is a lot of research on the SERVQUAL model adapted to the healthcare sector, no study has been done to understand the interactions among these dimensions. The major contribution of this research work is therefore the development of contextual relationships among the identified variables through a systematic framework.

Keywords: SERVQUAL, healthcare, quality, service quality

Procedia PDF Downloads 378
15579 Predicting Financial Distress in South Africa

Authors: Nikki Berrange, Gizelle Willows

Abstract:

Business rescue has become increasingly popular since its inclusion in the Companies Act of South Africa in May 2011. The Alternative Exchange (AltX) of the Johannesburg Stock Exchange has experienced a marked increase in the number of companies entering business rescue. This study sampled twenty companies listed on the AltX to determine whether Altman's Z-score model for emerging markets (ZEM) or Taffler's Z-score model is more accurate in predicting financial distress for small to medium-sized companies in South Africa. The study was performed over three time horizons (one, two, and three years prior to the event of financial distress) in order to determine how many companies each model predicted would be unlikely to succeed, as well as the predictive ability and accuracy of the respective models. The study found that Taffler's Z-score model had a greater ability to predict financial distress across all three time horizons.
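As a sketch of how such models score a firm, the coefficients commonly reported in the literature for Altman's emerging-market Z-score and Taffler's model are shown below; treat the coefficients as assumptions to verify against the original papers, and the sample ratios are invented, not drawn from the AltX sample.

```python
def altman_zem(wc_ta, re_ta, ebit_ta, bve_tl):
    """Altman emerging-market Z-score from four accounting ratios:
    working capital/total assets, retained earnings/total assets,
    EBIT/total assets, book value of equity/total liabilities."""
    return 3.25 + 6.56 * wc_ta + 3.26 * re_ta + 6.72 * ebit_ta + 1.05 * bve_tl

def taffler_z(pbt_cl, ca_tl, cl_ta, nci):
    """Taffler Z-score from: profit before tax/current liabilities,
    current assets/total liabilities, current liabilities/total assets,
    and the no-credit interval."""
    return 3.20 + 12.18 * pbt_cl + 2.50 * ca_tl - 10.68 * cl_ta + 0.029 * nci

# Hypothetical firm with moderately healthy ratios.
z_em = altman_zem(0.1, 0.1, 0.1, 0.5)
```

In a distress-prediction study, each sampled firm's ratios are computed one, two, and three years before the distress event, and the score is compared against the model's cut-off to classify the firm as likely distressed or not.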

Keywords: Altman’s ZEM-score, Altman’s Z-score, AltX, business rescue, Taffler’s Z-score

Procedia PDF Downloads 332
15578 Normalizing Logarithms of Realized Volatility in an ARFIMA Model

Authors: G. L. C. Yap

Abstract:

Modelling realized volatility with high-frequency returns is popular, as realized volatility is an unbiased and efficient estimator of return volatility. A computationally simple model fits the logarithms of the realized volatilities with a fractionally integrated long-memory Gaussian process. The Gaussianity assumption simplifies parameter estimation using the Whittle approximation. Nonetheless, this assumption may not be met in finite samples, and there may be a need to normalize the financial series. Based on the empirical indices S&P 500 and DAX, this paper examines the performance of the linear volatility model pre-treated with normalization compared to its existing counterpart. The empirical results show that including normalization as a pre-treatment procedure makes the forecast performance outperform the existing model in terms of statistical and economic evaluations.
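One common normalization pre-treatment is the rank-based inverse normal (normal scores) transform, sketched below; whether the paper uses this particular transform is an assumption, and the log realized-volatility series is made up.

```python
from statistics import NormalDist

def normal_scores(series):
    """Map a series to approximate standard-normal quantiles via ranks
    (rank-based inverse normal transform)."""
    n = len(series)
    order = sorted(range(n), key=lambda i: series[i])
    nd = NormalDist()
    scores = [0.0] * n
    for rank, i in enumerate(order, start=1):
        scores[i] = nd.inv_cdf(rank / (n + 1))  # rank -> uniform -> normal
    return scores

# Toy log realized-volatility series (made-up numbers).
log_rv = [-4.1, -3.8, -4.5, -3.2, -4.0, -3.6, -4.3]
z = normal_scores(log_rv)
```

The transform is monotone, so the ordering of the observations is preserved while their marginal distribution becomes approximately Gaussian, which is the property the Whittle estimation step relies on.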

Keywords: Gaussian process, long-memory, normalization, value-at-risk, volatility, Whittle estimator

Procedia PDF Downloads 332
15577 Interconnected Market Hypothesis: A Conceptual Model of Individualistic, Information-Based Interconnectedness

Authors: James Kinsella

Abstract:

There is currently very little understanding of how the interaction between investors, consumers, and firms (agents) affects a) the transmission of information and b) the creation and transfer of value and wealth between these two groups. Employing scholarly ideas from multiple research areas (behavioural finance, emotional finance, econo-biology, and game theory), we develop a conceptual theoretic model (the ‘bow-tie’ model) as a framework for considering this interaction. Our bow-tie model views information transfer and value and wealth creation and transfer through the lens of an investor-consumer connection facilitated through the communicative medium of the firm (agents). We confront our bow-tie model with theoretical and practical examples. Next, we utilise consumer and business confidence data alongside index data to conduct quantitative analysis, to support our bow-tie concept, and to introduce the concept of the ‘investor-consumer connection’. We highlight the importance of information persuasiveness, knowledge, and emotional categorization of characteristics in facilitating a communicative relationship between investors, consumers, and the firm (agents), forming academic and practical applications of the conceptual bow-tie model, alongside applications to wider instances, such as those seen within the Covid-19 pandemic.

Keywords: behavioral finance, emotional finance, economy-biology, social mood

Procedia PDF Downloads 97
15576 Academic Staff Recruitment in Islamic University: A Proposed Holistic Model

Authors: Syahruddin Sumardi, Indra Fajar Alamsyah, Junaidah Hashim

Abstract:

This study explores and presents a proposed recruitment model for Islamic universities aligned with their holistic role. It is a conceptual paper in nature and is designed to utilize an exploratory approach. Reviews of the literature and documents related to this topic are used as the methods to analyse the content. Recruitment is fundamental for any organization to achieve its goals effectively. Staffing in universities is vital due to the important role of lecturers. Currently, Islamic universities still adopt the common recruitment process for their academic staff, whereas they have their own characteristics embedded in their institutions. The FCWC (Foundation, Capability, Worldview and Commitment) model of recruitment is proposed to suit the holistic character of the Islamic university. Further studies are required to empirically validate the concept through systematic investigations; additionally, measuring this model by a purpose-designed instrument would be appreciated. The model provides a map and an alternative recruitment tool for Islamic universities to determine a recruitment process appropriate to their institutions. It also allows stakeholders and policy makers to consider the Islamic values that should be inculcated in Islamic higher learning institutions. This study makes a foundational contribution to an early sequence of research.

Keywords: academic staff, Islamic values, recruitment model, university

Procedia PDF Downloads 140
15575 Leadership Process Model: A Way to Provide Guidance in Dealing with the Key Challenges Within the Organisation

Authors: Rawaa El Ayoubi

Abstract:

Many researchers, academics, and practitioners developed leadership theories during the 20th century. This substantial effort has produced further leadership theories, generating considerable organisational research on leadership models in the contemporary literature. This paper explores the stages and drivers of leadership theory evolution based on the researcher's personal conclusions and review of leadership theories. The purpose of this paper is to create a Leadership Process Model (LPM) that can provide guidance in dealing with the key challenges within the organisation. This integrative model of organisational leadership is based on inner meaning, leader values, and vision. It further addresses the relationships between leadership theory, practice, and development, exploring why challenges exist within the field of leadership theory and how these challenges can be mitigated.

Keywords: leadership challenges, leadership process model, leadership theories, organisational leadership, paradigm development

Procedia PDF Downloads 57
15574 Wear Measuring and Wear Modelling Based On Archard, ASTM, and Neural Network Models

Authors: A. Shebani, C. Pislaru

Abstract:

Wear of materials is an everyday experience and has been observed and studied for a long time. The prediction of wear is a fundamental problem in industry, mainly related to the planning of maintenance interventions and economy. The pin-on-disc test is the most common test used to study wear behaviour. In this paper, a pin-on-disc rig (AEROTECH UNIDEX 11) is used to investigate the effects of normal load and material hardness on wear under dry sliding conditions. In the pin-on-disc rig, two specimens were used: a steel pin with a tip, positioned perpendicular to the disc, and an aluminium disc. The pin wear and disc wear were measured using the following instruments: a Talysurf profilometer, a digital microscope, and an Alicona instrument; the Talysurf profilometer was used to measure the pin/disc wear scar depth, and the Alicona was used to measure the volume loss of the pin and disc. After that, the Archard model, the American Society for Testing and Materials (ASTM) model, and a neural network model were used for pin/disc wear modelling, and the simulations were implemented in MATLAB. This paper focuses on how the Alicona can be considered a powerful tool for wear measurement and how the neural network is an effective algorithm for wear estimation.
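The Archard model named above relates worn volume linearly to load and sliding distance and inversely to hardness, and can be sketched in a few lines; the wear coefficient, load, sliding distance, and hardness below are illustrative values, not measurements from the rig.

```python
def archard_wear_volume(k, load_n, sliding_distance_m, hardness_pa):
    """Archard wear law: worn volume V = K * F * s / H, where K is the
    dimensionless wear coefficient, F the normal load (N), s the sliding
    distance (m), and H the hardness of the softer surface (Pa)."""
    return k * load_n * sliding_distance_m / hardness_pa

# Hypothetical dry-sliding test: K = 1e-3, 10 N load, 100 m of sliding,
# hardness 1 GPa for the softer (aluminium) counterface.
volume_m3 = archard_wear_volume(1e-3, 10.0, 100.0, 1e9)
```

In a study like this one, K would be fitted from the measured volume loss (e.g., from the Alicona data), after which the model predicts wear for other load and distance combinations.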

Keywords: wear modelling, Archard Model, ASTM Model, Neural Networks Model, Pin-on-disc Test, Talysurf, digital microscope, Alicona

Procedia PDF Downloads 428
15573 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change is driving change in all aspects of society. While the expansion of renewable energies proceeds, industry has not been convinced by general studies of the potential of demand-side management to reinforce smart grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support for industrial energy management, based on a holistic data analytics approach, is presented. The model is executed on the example of a strategic decision problem: integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of a data model, analysis, simulation, and optimization step. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, in analogy to a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.

Keywords: data analytics, green production, industrial energy management, optimization, renewable energies, simulation

Procedia PDF Downloads 415
15572 Patented Free-Space Optical System for Auto Aligned Optical Beam Allowing to Compensate Mechanical Misalignments

Authors: Aurelien Boutin

Abstract:

In optical systems such as variable optical delay lines, where a collimated beam has to travel back and forth, corner cubes are used in order to keep the reflected beam parallel to the incoming beam. However, the reflected beam can be laterally shifted, which leads to losses. In this paper, we report on a patented optical design that keeps the reflected beam at exactly the same position and direction whatever the displacement of the corner cube, leading to zero losses. After explaining how the optical design works and how it theoretically compensates for any defects in the translation of the corner cube, we present the results of experimental comparisons between a standard layout (i.e., only corner cubes) and our optical layout. To compare both optical layouts, we used a fiber-to-fiber coupling setup, which couples light from one fiber to the other by means of two lenses. Each [fiber + lens] assembly is fixed and called a collimator, so that light is coupled from one collimator to another. Each collimator was precisely made in order to have a precise working distance. In the experiment, we measured and compared the insertion loss (IL) variations between both collimators as a function of the distance between them (i.e., natural Gaussian beam coupling losses) and between both collimators in the different optical layouts tested, with the same optical propagation length. We show that the IL variations of our setup are less than 0.05 dB with respect to the IL variations of the collimators alone.

Keywords: free-space optics, variable optical delay lines, optical cavity, auto-alignment

Procedia PDF Downloads 74
15571 The Non-Uniqueness of Partial Differential Equations Options Price Valuation Formula for Heston Stochastic Volatility Model

Authors: H. D. Ibrahim, H. C. Chinwenyi, T. Danjuma

Abstract:

An option is a financial contract that gives the holder the right, but not the obligation, to buy or sell a specified quantity of an underlying asset at a fixed price (called the strike price) on or before the expiration date of the option. This paper examined two approaches to deriving the partial differential equation (PDE) option price valuation formula for the Heston stochastic volatility model. We obtained PDE option price valuation formulas using the riskless portfolio method and the application of the Feynman-Kac theorem, respectively. From the results obtained, we see that the two derived PDEs for the Heston model are distinct and non-unique. This establishes the incompleteness of the model for option price valuation.
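For reference, the pricing PDE usually obtained for the Heston model via the riskless portfolio argument (with asset price S, variance v, risk-free rate r, correlation ρ, volatility of volatility σ, mean reversion rate κ to level θ, and volatility risk premium λ) is commonly written as follows; this is the standard textbook form, not necessarily the exact notation of the paper:

```latex
\frac{\partial V}{\partial t}
+ \tfrac{1}{2} v S^{2} \frac{\partial^{2} V}{\partial S^{2}}
+ \rho \sigma v S \frac{\partial^{2} V}{\partial S \, \partial v}
+ \tfrac{1}{2} \sigma^{2} v \frac{\partial^{2} V}{\partial v^{2}}
+ r S \frac{\partial V}{\partial S}
+ \bigl[ \kappa(\theta - v) - \lambda(S, v, t) \bigr] \frac{\partial V}{\partial v}
- r V = 0
```

The market price of volatility risk λ is not pinned down by no-arbitrage arguments alone, which is why different derivations can yield distinct PDEs: this is the incompleteness the abstract refers to.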

Keywords: Black-Scholes partial differential equations, Ito process, option price valuation, partial differential equations

Procedia PDF Downloads 119
15570 Robustness Analysis of the Carbon and Nitrogen Co-Metabolism Model of Mucor mucedo

Authors: Nahid Banihashemi

Abstract:

Systems biology is an emerging and important area of the life sciences that seeks to understand the integrated behavior of large numbers of components interacting via non-linear reaction terms. A centrally important problem in this area is understanding the co-metabolism of protein and carbohydrate, since the ratio of these metabolites in the diet has been clearly demonstrated to be a major determinant of obesity and related chronic disease. In this regard, we consider a systems biology model of the co-metabolism of carbon and nitrogen in colonies of the fungus Mucor mucedo. Oscillations are an important diagnostic of the underlying dynamical processes of this model; the maintenance of specific patterns of oscillation and its relation to the robustness of the system are the issues targeted in this paper. Parametric sensitivity analysis is used as the theoretical framework for the robustness analysis. As a result, the parameters of the model that produce the largest sensitivities are identified. Furthermore, the largest change that can be made in each parameter without losing the oscillations in biomass production is computed. The results are obtained by implementing the parametric sensitivity analysis in MATLAB.
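The core of a parametric sensitivity analysis of this kind can be sketched in a few lines: integrate the ODE system, perturb each parameter in turn, and compute normalized sensitivities of a model output. The sketch below uses a generic two-species oscillator with hypothetical parameters as a stand-in; it is not the actual Mucor mucedo model, and the original study was implemented in MATLAB rather than Python.

```python
import numpy as np

def rk4(f, y0, t0, t1, n, p):
    """Fixed-step fourth-order Runge-Kutta integrator."""
    h = (t1 - t0) / n
    y = np.array(y0, float)
    for _ in range(n):
        k1 = f(y, p)
        k2 = f(y + h / 2 * k1, p)
        k3 = f(y + h / 2 * k2, p)
        k4 = f(y + h * k3, p)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

def model(y, p):
    """Toy two-species oscillator standing in for the carbon/nitrogen dynamics."""
    a, b = p
    c, n = y
    return np.array([a * c - b * c * n, b * c * n - n])

def output(p):
    """Model output of interest: final carbon pool (analogue of biomass)."""
    return rk4(model, [1.0, 0.5], 0.0, 5.0, 500, p)[0]

# Normalized finite-difference sensitivities s_i = (dY/dp_i) * p_i / Y.
p0 = np.array([1.2, 0.8])   # hypothetical nominal parameters
Y0 = output(p0)
sens = []
for i in range(len(p0)):
    dp = np.zeros_like(p0)
    dp[i] = 1e-6 * p0[i]
    sens.append((output(p0 + dp) - Y0) / dp[i] * p0[i] / Y0)
print(sens)
```

Ranking the absolute values of these sensitivities identifies the parameters that most strongly affect the oscillatory output, which is the quantity the robustness analysis targets.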

Keywords: systems biology, parametric sensitivity analysis, robustness, carbon and nitrogen co-metabolism, Mucor mucedo

Procedia PDF Downloads 299
15569 Upsetting of Tri-Metallic St-Cu-Al and St-Cu60Zn-Al Cylindrical Billets

Authors: Isik Cetintav, Cenk Misirli, Yilmaz Can

Abstract:

This work investigates the upsetting of tri-metallic cylindrical billets both experimentally and analytically at a reduction ratio of 30%. Steel, brass, and copper are used for the outer and outermost rings, and aluminum for the inner core. Two different models were designed to show the material flow and the cavity that forms at the two interfaces during forming at this reduction ratio. Each model has steel as the outermost ring. In Model 1, the outer ring between the outermost ring and the solid core is copper; in Model 2, it is brass. The solid core is aluminum in both models. The billets were upset in a press machine using parallel flat dies, and the upsetting load was recorded and compared for the models and for single billets. To extend the experimental results to a wider range of inner-core and outer-ring geometries, a finite element model was built using the ABAQUS software. The aim is to show how contact between the outermost ring, the outer ring, and the inner core evolves throughout the upsetting process. The results show that, as the height changes, Model 1 and Model 2 exhibit very good interaction between the outermost ring, the outer ring, and the inner core, and the contact surfaces of the two models show different interface behaviors. It is also observed that the tri-metallic billets have lower weight but better mechanical properties than single-material billets, which suggests producing and using these new materials for different purposes.

Keywords: tri-metallic, upsetting, copper, brass, steel, aluminum

Procedia PDF Downloads 321
15568 Development of Terrorist Threat Prediction Model in Indonesia by Using Bayesian Network

Authors: Hilya Mudrika Arini, Nur Aini Masruroh, Budi Hartono

Abstract:

There were more than 20 terrorist threats in Indonesia between 2002 and 2012. Despite this fact, no comprehensive preventive study in the field of national security has been conducted in Indonesia. This study aims to provide a preventive solution by developing a prediction model of terrorist threats in Indonesia using a Bayesian network. The model is built in eight stages, from the literature review, through building and verifying the Bayesian belief network, to what-if scenario analysis, drawing on four experts with different perspectives. The study yields several significant findings. First, the news and the readiness of the terrorist group are the most influential factors. Second, across several scenarios of the news proportion, the higher the proportion of positive news, the higher the probability that a terrorist threat will occur. Therefore, the preventive solution suggested by the model for reducing the terrorist threat in Indonesia is to keep the positive news proportion to a maximum of 38%.
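The what-if scenarios above amount to querying a discrete Bayesian network under different evidence. As a minimal sketch of that mechanism, the toy network below has two parents (news tone, group readiness) of a threat node and answers a query by summing out the unobserved parent; the structure and all CPT numbers are illustrative only, not the values elicited from the paper's experts.

```python
# Minimal discrete Bayesian network query by exhaustive enumeration.
# All probabilities below are made up for illustration.
P_news = {"pos": 0.38, "neg": 0.62}          # P(news tone)
P_ready = {"high": 0.3, "low": 0.7}          # P(group readiness)
P_threat = {                                 # P(threat = yes | news, readiness)
    ("pos", "high"): 0.8, ("pos", "low"): 0.4,
    ("neg", "high"): 0.5, ("neg", "low"): 0.1,
}

def p_threat_given_news(news):
    """P(threat = yes | news) obtained by summing out readiness."""
    return sum(P_ready[r] * P_threat[(news, r)] for r in P_ready)

for tone in P_news:
    print(tone, round(p_threat_given_news(tone), 3))
```

A what-if scenario then simply re-evaluates such queries after changing an input distribution (e.g., the news proportion), which is how a threshold like the 38% figure can be read off the model.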

Keywords: Bayesian network, decision analysis, national security system, text mining

Procedia PDF Downloads 368
15567 Electro-Hydrodynamic Analysis of Low-Pressure DC Glow Discharge by Lattice Boltzmann Method

Authors: Ji-Hyok Kim, Il-Gyong Paek, Yong-Jun Kim

Abstract:

We propose a numerical model based on drift-diffusion theory and the lattice Boltzmann method (LBM) to analyze the electro-hydrodynamic behavior of low-pressure direct current (DC) glow discharge plasmas. We apply drift-diffusion theory to four species, employing the standard lattice Boltzmann model (SLBM) for the electrons, the finite difference-lattice Boltzmann model (FD-LBM) for the heavy particles, and the finite difference method (FDM) for the electric potential. Our results are compared with those of other methods and emphasize the necessity of a two-dimensional analysis of the glow discharge.
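One of the three building blocks named above, the finite-difference solve of the electric potential, can be sketched compactly. The snippet below solves the 1D Poisson equation d²φ/dx² = −ρ/ε₀ between two electrodes with Dirichlet boundary conditions; the uniform charge density and electrode voltage are placeholders, not a self-consistent discharge profile from the paper's coupled model.

```python
import numpy as np

# 1D finite-difference Poisson solve for the electric potential between
# a cathode at V0 and a grounded anode.
N, L = 101, 0.01            # grid points, electrode gap [m]
eps0 = 8.854e-12            # vacuum permittivity [F/m]
rho = np.full(N, 1e-6)      # charge density [C/m^3], illustrative value
V0 = 300.0                  # cathode potential [V], illustrative value

h = L / (N - 1)
A = np.zeros((N, N))
b = -rho * h**2 / eps0      # interior right-hand side of the discretization
for i in range(1, N - 1):
    # Central difference: phi[i-1] - 2*phi[i] + phi[i+1] = -rho*h^2/eps0
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
A[0, 0] = A[-1, -1] = 1.0   # Dirichlet rows
b[0], b[-1] = V0, 0.0

phi = np.linalg.solve(A, b)
print(phi[0], phi[-1])
```

In the coupled scheme, a solve like this would be repeated each time step, with ρ rebuilt from the LBM-transported species densities.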

Keywords: glow discharge, lattice Boltzmann method, numerical analysis, plasma simulation, electro-hydrodynamic

Procedia PDF Downloads 65
15566 TELUM Land Use Model: An Investigation of Data Requirements and Calibration Results for Chittenden County MPO, U.S.A.

Authors: Georgia Pozoukidou

Abstract:

TELUM is a land use model designed specifically to help metropolitan planning organizations (MPOs) prepare their transportation improvement programs and fulfill their numerous planning responsibilities. In this context, obtaining, preparing, and validating socioeconomic forecasts are fundamental tasks for an MPO, which must ensure that consistent population and employment data are provided to travel demand models. The Chittenden County Metropolitan Planning Organization of the State of Vermont was used as a case study to test the applicability of the TELUM land use model. The technical insights and lessons learned from this application have transferable value for all MPOs faced with land use forecasting and transportation modelling.

Keywords: calibration data requirements, land use models, land use planning, metropolitan planning organizations

Procedia PDF Downloads 270
15565 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we analyze maximum precipitation data recorded over a particular period at different stations in the Global Historical Climatology Network of the USA. One important point is that some stations are shut down on certain days for one reason or another, so the maximum values are recorded by excluding those readings. It is assumed that the number of operating stations follows a zero-truncated Poisson distribution and that the daily precipitation follows a lognormal distribution; we call the resulting model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters and can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm, and approximate maximum likelihood estimators are also derived. The associated confidence intervals can be obtained from the observed Fisher information matrix. Simulations were performed to check the performance of the EM algorithm, and it works quite well in this case. When the precipitation data set is analyzed with the proposed model, it provides a better fit than some of the existing models.
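The data-generating mechanism assumed above is easy to simulate, which is also how the paper's EM performance checks would be set up: draw a zero-truncated Poisson count of operating stations, draw that many lognormal precipitation values, and record their maximum. The parameter values below are illustrative, not the estimates fitted to the precipitation data.

```python
import numpy as np

rng = np.random.default_rng(0)

def rztp(lam, size, rng):
    """Zero-truncated Poisson draws by simple rejection of zeros."""
    out = np.empty(size, dtype=int)
    for i in range(size):
        n = 0
        while n == 0:
            n = rng.poisson(lam)
        out[i] = n
    return out

def sample_max(lam, mu, sigma, size, rng):
    """Observed maxima Y = max(X_1, ..., X_N), N ~ ZTP(lam), X_i lognormal."""
    counts = rztp(lam, size, rng)
    return np.array([rng.lognormal(mu, sigma, n).max() for n in counts])

# Illustrative parameters (lam, mu, sigma) for the three-parameter model.
y = sample_max(lam=5.0, mu=0.0, sigma=0.5, size=2000, rng=rng)
print(y.mean(), y.min())
```

An EM fit would then treat the unobserved station count N as the latent variable: the E-step computes the posterior over N given each observed maximum, and the M-step updates (λ, μ, σ) against that posterior.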

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 86
15564 A Predictive Machine Learning Model of the Survival of Female-led and Co-Led Small and Medium Enterprises in the UK

Authors: Mais Khader, Xingjie Wei

Abstract:

This research sheds light on female entrepreneurs by providing new insights into the survival of companies led by females in the UK. The study builds a predictive machine learning model of the survival of female-led and co-led small and medium enterprises (SMEs) in the UK over the period 2000-2020. The model uses a combination of financial and non-financial features of both the companies and their directors to predict SME survival, and these features are studied in terms of their contribution to the resulting model. Five machine learning models are compared: decision tree, AdaBoost, naïve Bayes, logistic regression, and SVM. AdaBoost performed best of the five, with an accuracy of 73% and an AUC of 80%. The most important features for predicting survival were company size, management experience, financial performance, industry, region, and the percentage of females in management.
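The modelling pipeline described here (train AdaBoost, score by accuracy and AUC, then inspect feature importances) can be sketched with scikit-learn on synthetic data. The synthetic features below are a stand-in for the paper's financial and non-financial SME features, and the scores it prints will not match the paper's 73%/80% figures.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the SME dataset: 10 mixed features, binary
# survival label. Feature semantics (size, experience, ...) are assumed.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
acc = clf.score(X_te, y_te)
print(f"accuracy={acc:.2f}  AUC={auc:.2f}")

# Per-feature contribution, analogous to the paper's importance ranking.
print(clf.feature_importances_.round(3))
```

Reading off the largest entries of `feature_importances_` is the step that, on the real data, surfaces company size, management experience, and the other factors the abstract lists.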

Keywords: company survival, entrepreneurship, females, machine learning, SMEs

Procedia PDF Downloads 65
15563 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs

Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye

Abstract:

This study collected and preprocessed a dataset of chest radiographs, formulated a deep neural network model for detecting abnormalities, evaluated the model's performance, and implemented a prototype, with the aim of automatically classifying abnormalities in chest radiographs. A large set of chest X-ray images was collected from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group, Stanford University. The radiographs were preprocessed, using standardization and normalization, into a format that can be fed into a deep neural network. The classification problem was formulated as a multi-label binary classification model that uses a convolutional neural network architecture to decide whether each abnormality is present in a radiograph. The model was evaluated using specificity, sensitivity, and the Area Under the Curve (AUC) score. A prototype was implemented using the Keras open-source deep learning framework in the Python programming language. Based on the ROC curves, the model was able to classify atelectasis, support devices, pleural effusion, pneumonia, a normal CXR (no finding), pneumothorax, and consolidation, whereas lung opacity and cardiomegaly had probabilities below 0.5 and were therefore classified as absent. Precision, recall, and F1 score were all 0.78, implying comparable numbers of false positives and false negatives and revealing some measure of label imbalance in the dataset. The study concludes that the developed model is sufficient to classify abnormalities in chest radiographs as present or absent.
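The multi-label decision rule described above differs from ordinary single-label classification: each finding gets its own sigmoid output and an independent present/absent decision at a 0.5 threshold, rather than one softmax winner. The sketch below shows only that final decision step in NumPy (avoiding a Keras dependency); the logit values for the example radiograph are invented for illustration.

```python
import numpy as np

# Label set modeled on the CheXpert findings discussed in the abstract.
LABELS = ["Atelectasis", "Cardiomegaly", "Consolidation", "Pneumonia",
          "Pneumothorax", "Pleural Effusion", "Lung Opacity",
          "Support Devices", "No Finding"]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical logits from the network's final dense layer for one image.
logits = np.array([2.1, -1.3, 0.4, 1.7, -0.2, 0.9, -2.0, 3.0, -1.1])
probs = sigmoid(logits)

# Multi-label rule: each finding is an independent binary decision at
# p >= 0.5, so any number of findings can be "present" simultaneously.
present = probs >= 0.5
for name, p, flag in zip(LABELS, probs, present):
    print(f"{name:16s} p={p:.2f} -> {'present' if flag else 'absent'}")
```

In a Keras prototype this corresponds to a final `Dense` layer with sigmoid activations trained against a binary cross-entropy loss, one output unit per finding.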

Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label

Procedia PDF Downloads 89