Search results for: nonlinear analytical model
16494 Numerical Study of Blackness Factor Effect on Dark Solitons

Authors: Khelil Khadidja

Abstract:

In this paper, the blackness of dark solitons is considered. The exact balance between nonlinearity and dispersion is responsible for soliton stability. Dark solitons form when normal dispersion is balanced by nonlinearity, in contrast to bright solitons, which arise from the combination of anomalous dispersion and nonlinearity. Thanks to their stability, dark solitons are suitable for transmission over optical fibers. Dark solitons, which are solutions of the nonlinear Schrodinger equation, are simulated in MATLAB to discuss the influence of the coefficient of blackness. Results show a direct relationship between the coefficient of blackness and the intensity dip of the dark soliton. These gray solitons are stable and convenient for transmission.
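
As a worked illustration of how the blackness coefficient enters (a minimal sketch of the textbook dark-soliton intensity profile, not the authors' MATLAB code; P0 and tau0 are assumed normalisation constants): the defocusing-NLSE dark soliton has |u(tau)|^2 = P0[1 - B^2 sech^2(B tau/tau0)], so the depth of the central intensity dip scales as B^2, with B = 1 a black soliton and 0 < B < 1 a gray one.

```python
import numpy as np

def dark_soliton_intensity(tau, p0=1.0, b=1.0, tau0=1.0):
    """Intensity profile |u|^2 of a textbook NLSE dark soliton.

    b is the blackness coefficient: b = 1 gives a black soliton
    (intensity drops to zero at the centre), 0 < b < 1 a gray one.
    """
    return p0 * (1.0 - b**2 / np.cosh(b * tau / tau0)**2)

for b in (0.4, 0.7, 1.0):
    dip = 1.0 - dark_soliton_intensity(0.0, b=b)  # depth of the central dip
    print(f"blackness B = {b:.1f} -> central intensity dip = {dip:.2f} * P0")
```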

Keywords: anomalous dispersion, nonlinearity, optical fiber, soliton

Procedia PDF Downloads 198
16493 Catchment Yield Prediction in an Ungauged Basin Using PyTOPKAPI

Authors: B. S. Fatoyinbo, D. Stretch, O. T. Amoo, D. Allopi

Abstract:

This study extends the use of the Drainage Area Regionalization (DAR) method to generating synthetic data and calibrating PyTOPKAPI stream yield for an ungauged basin at a daily time scale. The generation of runoff for determining a river yield depends on various topographic and spatial meteorological variables, which together form the Catchment Characteristics Model (CCM). Many of the conventional CCM models adopted in Africa have been challenged by a paucity of adequate, relevant, and accurate data with which to parameterize and validate them. The purpose of generating synthetic flow is to test a hydrological model in a way that does not suffer from the impact of very low or very high flows, thus allowing a check of whether the model is structurally sound. The employed physically based, watershed-scale hydrologic model (PyTOPKAPI) was parameterized with GIS pre-processing parameters and remotely sensed hydro-meteorological variables. Validation against the mean annual runoff ratio shows good graphical agreement between the observed and simulated discharge. Nash-Sutcliffe efficiency and coefficient of determination (R²) values of 0.704 and 0.739 indicate strong model efficiency. Given current climate variability, water planners now have a tool for flow quantification and sustainable planning purposes.
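
The two goodness-of-fit statistics quoted above are standard; a minimal sketch of how they are computed (the discharge arrays are hypothetical, not the study's data):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the
    simulation is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1]**2

# hypothetical daily discharge series (m^3/s)
obs = np.array([1.2, 3.4, 2.8, 5.1, 4.0, 2.2, 1.9])
sim = np.array([1.0, 3.0, 3.1, 4.6, 4.4, 2.5, 1.7])
print(round(nash_sutcliffe(obs, sim), 3), round(r_squared(obs, sim), 3))
```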

Keywords: catchment characteristics model, GIS, synthetic data, ungauged basin

Procedia PDF Downloads 327
16492 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models

Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue

Abstract:

Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised because knowledge of the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Such reactions are therefore often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user visually. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated against the model, leading to the highlighting of the nodes causing unsatisfiability (i.e. error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, in this research we present a framework that enables a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. Such a tool enables an end-user to determine which empirical analysis to perform to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.

Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation

Procedia PDF Downloads 255
16491 Development of Digital Twin Concept to Detect Abnormal Changes in Structural Behaviour

Authors: Shady Adib, Vladimir Vinogradov, Peter Gosling

Abstract:

Digital Twin (DT) technology is a new technology that appeared in the early 21st century. A DT is defined as the digital representation of living and non-living physical assets. By connecting the physical and virtual assets, data are transmitted smoothly, allowing the virtual asset to fully represent the physical asset. Although many studies have been conducted on the DT concept, there is still limited information about the ability of DT models to monitor and detect unexpected changes in structural behaviour in real time. This is due to the large computational effort required for the analysis and the excessively large amount of data transferred from sensors. This paper aims to develop the DT concept to detect abnormal changes in structural behaviour in real time using advanced modelling techniques, deep learning algorithms, and data acquisition systems, taking model uncertainties into consideration. Finite element (FE) models were first developed offline to be used with a reduced basis (RB) model order reduction technique for the construction of a low-dimensional space to speed up the analysis during the online stage. The RB model was validated against experimental test results for the establishment of a DT model of a two-dimensional truss. The established DT model and deep learning algorithms were used to identify the location of damage once it appeared during the online stage. Finally, the RB model was used again to identify the damage severity. It was found that using the RB model, constructed offline, speeds up the FE analysis during the online stage. The constructed RB model showed higher accuracy for predicting the damage severity, while deep learning algorithms were found to be useful for estimating the location of damage of small severity.
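
A minimal sketch of the offline/online split behind reduced basis (RB) model order reduction, shown on a toy linear system with a POD basis (illustrative assumptions throughout; the paper's truss FE model and damage-identification pipeline are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                   # full-order model size
K = np.diag(np.arange(1.0, n + 1))        # toy "stiffness" matrix (SPD)

# Offline stage: collect full-order snapshots for several load cases
# and build a low-dimensional basis V via SVD (POD).
loads = [rng.normal(size=n) for _ in range(10)]
snapshots = np.column_stack([np.linalg.solve(K, f) for f in loads])
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :5]                              # keep 5 dominant modes

# Online stage: project and solve a 5x5 system instead of 200x200.
f_new = loads[0] + 0.1 * rng.normal(size=n)
K_r = V.T @ K @ V
u_r = np.linalg.solve(K_r, V.T @ f_new)
u_approx = V @ u_r                        # lift back to the full space

u_full = np.linalg.solve(K, f_new)
err = np.linalg.norm(u_full - u_approx) / np.linalg.norm(u_full)
print(f"relative RB error: {err:.2e}")
```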

Keywords: data acquisition system, deep learning, digital twin, model uncertainties, reduced basis, reduced order model

Procedia PDF Downloads 99
16490 Switched Uses of a Bidirectional Microphone as a Microphone and Sensors with High Gain and Wide Frequency Range

Authors: Toru Shionoya, Yosuke Kurihara, Takashi Kaburagi, Kajiro Watanabe

Abstract:

Mass-produced bidirectional microphones have attractive characteristics. They work as microphones as well as sensors with high gain over a wide frequency range; they are also highly reliable and economical. We present multiple novel functional uses of these microphones. A mathematical model explaining the high-pass-filtering characteristics of bidirectional microphones is presented. Based on the model, the characteristics of the microphone were investigated, and a novel use of the microphone as a sensor with a wide frequency range is presented. In this study, applications of the microphone as a security sensor and a human biosensor are introduced. The mathematical model was validated through experiments, and the feasibility of the abovementioned applications for security monitoring and biosignal monitoring was examined experimentally.
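
A sketch of the kind of first-order high-pass response such a model describes, H(s) = s/(s + ωc) (the corner frequency below is an assumption, not the paper's fitted value):

```python
import numpy as np
from scipy import signal

fc = 50.0                       # assumed corner frequency in Hz
wc = 2 * np.pi * fc
system = signal.TransferFunction([1.0, 0.0], [1.0, wc])  # H(s) = s / (s + wc)

w = 2 * np.pi * np.logspace(-1, 4, 300)   # 0.1 Hz .. 10 kHz
w, mag, phase = signal.bode(system, w=w)
for f_hz in (0.1, 1.0, 50.0, 1000.0):
    idx = np.argmin(np.abs(w / (2 * np.pi) - f_hz))
    print(f"{f_hz:7.1f} Hz: {mag[idx]:6.1f} dB")  # attenuation below fc, flat above
```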

Keywords: bidirectional microphone, low-frequency, mathematical model, frequency response

Procedia PDF Downloads 545
16489 Comparative Assessment of a Distributed Model and a Lumped Model for Estimating Sediment Yield in Small Urban Areas

Authors: J. Zambrano Nájera, M. Gómez Valentín

Abstract:

Increasing urbanization during the twentieth century has brought with it a major problem: increased sediment production. Hydraulic erosion is one of the major causes of increased sediment in small urban catchments. Such increases in sediment yield in headwater urban catchments can obstruct drainage systems, making it impossible to capture urban runoff, increasing runoff volumes, and thus exacerbating urban flooding. For these reasons, it is increasingly important to study sediment production in urban watersheds in order to properly analyze and solve sediment-related problems. The study of sediment production has improved with the use of mathematical modeling. For that reason, we propose a new physically based model applicable to small headwater urban watersheds that retains the advantages of distributed physically based models but with more realistic data requirements. Additionally, in this paper the proposed model is compared with a lumped model, reviewing the results and the advantages and disadvantages of both.

Keywords: erosion, hydrologic modeling, urban runoff, sediment modeling, sediment yielding, urban planning

Procedia PDF Downloads 347
16488 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana

Authors: Lidon Lashley

Abstract:

This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is undergoing a metamorphosis but is still firmly held in the tenets of the medical model of disability, which influences the experiences of children with special education needs and/or disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis, called situational analysis. The data suggest that it is possible, but will be challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of a social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.

Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index

Procedia PDF Downloads 162
16487 Analysis of the Diffusion Behavior of an Information and Communication Technology Platform for City Logistics

Authors: Giulio Mangano, Alberto De Marco, Giovanni Zenezini

Abstract:

The concept of City Logistics (CL) has emerged to mitigate the impacts of last-mile freight distribution in urban areas. In this paper, a System Dynamics (SD) model exploring the dynamics of the diffusion of an ICT platform for CL management across different populations is proposed. Two sources were used for the development of the model. On the one hand, the major diffusion variables and feedback loops are derived from a literature review of existing diffusion models. On the other hand, the parameters are represented by the value propositions delivered by the platform in response to some of the users' needs. To extract the most important value propositions, the Business Model Canvas approach was used. This approach focuses on understanding how a company can create value for its target customers. These variables and parameters are then translated into an SD diffusion model with three different populations, namely municipalities, logistics service providers, and own-account carriers. Results show that the three populations under analysis fully adopt the platform within the simulation time frame, highlighting a strong demand by different stakeholders for CL projects aimed at carrying out more efficient urban logistics operations.
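
For readers unfamiliar with SD diffusion structures, a minimal Bass-style sketch for a single adopter population (the paper's model couples three populations and platform value propositions; the parameters here are assumptions):

```python
import numpy as np

def bass_adoption(m=100, p=0.01, q=0.4, t_end=20.0, dt=0.1):
    """Euler integration of the Bass diffusion ODE:
    dA/dt = (p + q*A/m) * (m - A), with A adopters out of m potential ones.
    p = innovation (external) coefficient, q = imitation (word of mouth)."""
    steps = int(t_end / dt)
    a, path = 0.0, []
    for _ in range(steps):
        a += dt * (p + q * a / m) * (m - a)
        path.append(a)
    return np.array(path)

adopters = bass_adoption()
for year in (5, 10, 15, 20):
    print(f"t = {year:2d}: {adopters[int(year / 0.1) - 1]:5.1f} of 100 adopters")
```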

Keywords: city logistics, simulation, system dynamics, business model

Procedia PDF Downloads 266
16486 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trial

Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs

Abstract:

Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment with respect to a reference treatment (placebo or standard). The main advantage of a cross-over design over a conventional parallel design is its flexibility: every subject becomes his or her own control, thereby reducing confounding effects. Jones & Kenward discuss in detail more recent developments in the analysis of cross-over trials. We revisit the simple piecewise linear mixed-effects model proposed by Mwangi et al. (in press) for its first application to the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models, namely (1) the Grizzle model and (2) the Jones & Kenward model, used in the estimation of the treatment effect in the analysis of randomized cross-over trials. We estimated two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes. Its coverage probabilities were the highest compared to the Grizzle and Jones & Kenward models for both small and large sample sizes. A piecewise linear mixed-effects model is a better estimator of the treatment effect than its two competitors (the Grizzle and Jones & Kenward models) in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-Treatments x 2-Periods cross-over design. Its application is extendible to more complex cross-over designs with multiple treatments and periods. In addition, it is important to note that, even for single-response models, adding more random effects increases the complexity of the model and thus may be difficult or impossible to fit in some cases.
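
The two performance measures are generic and easy to reproduce by simulation; a sketch with a toy two-group estimator (not the Grizzle, Jones & Kenward, or piecewise linear mixed-effects fits themselves):

```python
import numpy as np

rng = np.random.default_rng(42)
true_effect, n_per_arm, n_sims = 2.0, 20, 2000
hits, sq_err = 0, []

for _ in range(n_sims):
    a = rng.normal(true_effect, 3.0, n_per_arm)   # treatment responses
    b = rng.normal(0.0, 3.0, n_per_arm)           # reference responses
    est = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / n_per_arm + b.var(ddof=1) / n_per_arm)
    sq_err.append((est - true_effect) ** 2)
    if est - 1.96 * se <= true_effect <= est + 1.96 * se:
        hits += 1                                  # 95% CI covered the truth

print(f"MSE      : {np.mean(sq_err):.3f}")
print(f"coverage : {hits / n_sims:.3f}  (nominal 0.95)")
```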

Keywords: evaluation, Grizzle model, Jones & Kenward model, performance measures, simulation

Procedia PDF Downloads 122
16485 Analysis of Production Forecasting in Unconventional Gas Resources Development Using Machine Learning and Data-Driven Approach

Authors: Dongkwon Han, Sangho Kim, Sunil Kwon

Abstract:

Unconventional gas resources have dramatically changed the future energy landscape. Unlike conventional gas resources, a key challenge in unconventional gas is the need for advanced approaches to production forecasting, owing to the uncertainty and complexity of fluid flow. In this study, an artificial neural network (ANN) model integrating machine learning and a data-driven approach was developed to predict productivity in shale gas. A database of 129 wells in the Eagle Ford shale basin was used for training and testing the ANN model. Input data related to hydraulic fracturing, well completion, and shale gas productivity were selected, and the output is cumulative production. The performance of the ANN using all data sets, clustering, and variable importance (VI) models was compared in terms of mean absolute percentage error (MAPE). The MAPE values were 44.22% for the ANN using all data sets; 10.08% (cluster 1), 5.26% (cluster 2), and 6.35% (cluster 3) with clustering; and 32.23% (ANN VI) and 23.19% (SVM VI) with variable importance. The results show that the pre-trained ANN model provides more accurate results than the ANN model using all data sets.
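
A sketch of the clustering-then-ANN pattern with MAPE as the error measure, on synthetic stand-in data (scikit-learn is used here for brevity; the study's wells, features, and network settings are not reproduced):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# synthetic stand-ins for completion/fracturing features and cumulative production
X = rng.uniform(0, 1, size=(129, 5))
y = 100 + 80 * X[:, 0] + 50 * X[:, 1] ** 2 + rng.normal(0, 5, 129)

def mape(y_true, y_pred):
    return 100 * np.mean(np.abs((y_true - y_pred) / y_true))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# one ANN per cluster: assign wells to clusters, then train cluster-local models
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_tr)
for c in range(3):
    tr_mask = km.labels_ == c
    te_mask = km.predict(X_te) == c
    if not te_mask.any():
        continue
    ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                       random_state=0).fit(X_tr[tr_mask], y_tr[tr_mask])
    print(f"cluster {c}: MAPE = {mape(y_te[te_mask], ann.predict(X_te[te_mask])):.2f}%")
```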

Keywords: unconventional gas, artificial neural network, machine learning, clustering, variable importance

Procedia PDF Downloads 196
16484 Development of a Thermodynamic Model for Ladle Metallurgy Steel Making Processes Using Factsage and Its Macro Facility

Authors: Prasenjit Singha, Ajay Kumar Shukla

Abstract:

To produce high-quality steel in larger volumes, dynamic control of composition and temperature throughout the process is essential. In this paper, we develop a mass transfer model based on thermodynamics to simulate the ladle metallurgy steel-making process using FactSage and its macro facility. The overall heat and mass transfer process consists of one equilibrium chamber, two non-equilibrium chambers, and one adiabatic reactor. The flow of material, as well as heat transfer, occurs across four interconnected unit chambers and a reactor. We used the macro programming facility of FactSage™ software to build the thermochemical model of the secondary steel-making process. In our model, we varied the oxygen content during the process and studied its effect on the composition of the final hot metal and slag. The model has been validated against plant data for the steel composition, which is similar to that of the ladle metallurgy steel-making process in industry. The resulting composition profile serves as a guiding tool to optimize the ladle metallurgy process in steel-making industries.

Keywords: desulphurization, degassing, FactSage, reactor

Procedia PDF Downloads 217
16483 Biaxial Fatigue Specimen Design and Testing Rig Development

Authors: Ahmed H. Elkholy

Abstract:

An elastic analysis is developed to obtain the distribution of stresses, strains, bending moments, and deformation for a thin, hollow, variable-thickness cylindrical specimen subjected to different biaxial loadings. The specimen was subjected to a combination of internal pressure, axial tensile loading, and external pressure. Several axial-to-circumferential stress ratios were investigated in detail. The analytical model was then validated using experimental results obtained from a test rig under several biaxial loadings. Based on the preliminary results obtained, the specimen was modified geometrically to ensure uniform strain distribution through its wall thickness and along its gauge length. The new design of the specimen has a higher buckling strength and a maximum value of equivalent stress according to the maximum distortion energy theory. A cyclic function generator of the standard servo-controlled, electro-hydraulic testing machine is used to generate a specific signal shape (sine, square, ...) at a given frequency. The two independent controllers of the electronic circuit drive each servo-valve piston independently. The movement of each piston pressurizes the upper and lower sides of the actuators alternately, so the specimen is subjected to axial and diametral loads independently of each other. The hydraulic system has two different pressures: one pressure is responsible for the axial stress produced in the specimen and the other for the tangential stress. Changing the ratio of the two pressures changes the stress ratios accordingly. The only restrictions on the maximum stress obtained are the capacity of the testing system and specimen instability due to buckling.
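
The target stress state follows from the classical thin-walled cylinder formulas; a short worked example (dimensions and loads are hypothetical):

```python
import math

def thin_cylinder_stresses(p_int, f_axial, r, t):
    """Membrane stresses (MPa) in a thin-walled cylinder.

    sigma_hoop  = p*r/t           (circumferential, from internal pressure)
    sigma_axial = p*r/(2t) + F/A  (closed-end pressure term plus applied load)
    """
    area = 2 * math.pi * r * t            # cross-sectional wall area, mm^2
    sigma_hoop = p_int * r / t
    sigma_axial = p_int * r / (2 * t) + f_axial / area
    return sigma_axial, sigma_hoop

# hypothetical specimen: r = 25 mm, t = 1 mm, p = 10 MPa, F = 5 kN
sa, sh = thin_cylinder_stresses(p_int=10.0, f_axial=5000.0, r=25.0, t=1.0)
print(f"axial = {sa:.1f} MPa, hoop = {sh:.1f} MPa, ratio = {sa / sh:.2f}")
```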

Keywords: biaxial, fatigue, stress, testing

Procedia PDF Downloads 128
16482 Temperament and Character Dimensions as Personality Predictors of Relationship Quality: An Actor-Partner Interdependence Model

Authors: Dora Vajda, Somayyeh Mohammadi, Sandor Rozsa

Abstract:

Predicting relationship satisfaction based on the personality characteristics of both partners has a long history, and the association between relationship quality and personality traits has been demonstrated previously. Personality traits are most commonly assessed using the Five-Factor Model. The present study focuses on Cloninger's psychobiological model of personality, which accounts for dimensions of both temperament and character. The goal of this study was to examine the actor and partner effects of each couple's personality on relationship outcomes. In total, 184 heterosexual couples completed the Temperament and Character Inventory (TCI) and the Dyadic Adjustment Scale. The analysis was based on the Actor-Partner Interdependence Model (APIM) using multilevel modeling (MLwiN). Results showed that the character dimensions Self-Directedness and Cooperativeness had statistically meaningful actor and partner effects on both partners' relationship quality, whereas the male temperament dimension Reward Dependence had only an actor effect on his own relationship quality. The findings contribute to the literature by highlighting the role of the character dimensions of personality in romantic relationships.

Keywords: APIM (actor-partner interdependence model), MLwiN, personality, relationship quality

Procedia PDF Downloads 414
16481 Flexible Capacitive Sensors Based on Paper Sheets

Authors: Mojtaba Farzaneh, Majid Baghaei Nejad

Abstract:

This article proposes a new flexible capacitive tactile sensor based on paper sheets. The method combines the parameters of the sensor's material and dielectric to form a new model of flexible capacitive sensors, and the article presents a practical explanation of the method's application and advantages. With this new method, it is possible to make a more flexible and accurate sensor than the current models. To assess its performance, a common capacitive sensor is simulated, and the proposed model and one of the existing models are evaluated. The results indicate that the proposed model can enhance the speed and accuracy of a tactile sensor and has a smaller error than current models. Based on these results, it can be claimed that, in comparison with current models, the proposed model offers more flexibility and more accurate output parameters when the sensor is touched, especially in abnormal situations and on uneven surfaces, and increases accuracy and practicality.
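
The sensing principle rests on the parallel-plate relation C = ε0·εr·A/d; a sketch of how an uneven gap, e.g. a bent paper dielectric, shifts the capacitance (geometry and permittivity values are assumptions):

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2, gap_m, eps_r):
    """Ideal parallel-plate capacitance C = eps0 * eps_r * A / d."""
    return EPS0 * eps_r * area_m2 / gap_m

def uneven_capacitance(area_m2, gaps_m, eps_r):
    """Uneven surface modelled as parallel strips with different gaps:
    the total C is the sum of the strip capacitances."""
    strip_area = area_m2 / len(gaps_m)
    return sum(plate_capacitance(strip_area, d, eps_r) for d in gaps_m)

A, eps_paper = 1e-4, 3.0                      # 1 cm^2 electrode; paper eps_r ~ 3
flat = plate_capacitance(A, 100e-6, eps_paper)
bent = uneven_capacitance(A, np.linspace(80e-6, 120e-6, 50), eps_paper)
print(f"flat 100 um gap: {flat * 1e12:.2f} pF, uneven 80-120 um: {bent * 1e12:.2f} pF")
```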

Keywords: capacitive sensor, paper sheets, flexible, tactile, uneven

Procedia PDF Downloads 353
16480 Simulation of Bird Strike on Airplane Wings by Using SPH Methodology

Authors: Tuğçe Kiper Elibol, İbrahim Uslan, Mehmet Ali Guler, Murat Buyuk, Uğur Yolum

Abstract:

According to an FAA report, 142,603 bird strikes were reported over a 24-year period between 1990 and 2013. Bird strikes on aerospace structures not only threaten flight safety but also cause financial loss and endanger lives. The statistics show that most bird strikes happen on the nose and the leading edge of the wings. A substantial number of bird strikes are also absorbed by the jet engines, causing damage to the blades and engine body. Crash-proof designs are required to overcome the possibility of catastrophic failure of the airplane. Using computational methods for bird strike analysis during the product development phase is of considerable importance in terms of cost saving. Clearly, using simulation techniques to reduce the number of reference tests can dramatically affect the total cost of an aircraft, since full-scale tests are often required for bird strike assessment. Therefore, validated numerical models are required that can replace preliminary tests and accelerate the design cycle. In this study, to verify the simulation parameters for a bird strike analysis, several different numerical options are studied for an impact case against a primitive structure. Then, a representative bird model is generated with the verified parameters and impacted against the leading edge of a training aircraft wing, where each structural member of the wing is explicitly modeled. A nonlinear explicit dynamics finite element code, LS-DYNA, was used for the bird impact simulations. SPH methodology was used to model the behavior of the bird. The dynamic behavior of the wing superstructure was observed and will be used for further design optimization purposes.
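
A sketch of the basic SPH ingredient such bird models are built on, the smoothing-kernel density summation (a generic 3D cubic-spline kernel; this is not the LS-DYNA bird material model):

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic-spline SPH kernel W(r, h)."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)              # 3D normalisation constant
    w = np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

# toy "bird": particles on a small cubic lattice with a water-like density
grid = np.arange(0.0, 0.05, 0.01)
pos = np.array([(x, y, z) for x in grid for y in grid for z in grid])
m = np.full(len(pos), 1000.0 * 0.01**3)       # 1000 kg/m^3 * cell volume
print(f"mean SPH density: {sph_density(pos, m, h=0.015).mean():.0f} kg/m^3")
```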

Keywords: bird impact, bird strike, finite element modeling, smoothed particle hydrodynamics

Procedia PDF Downloads 327
16479 Creative Peace Diplomacy Model by the Perspective of Dialogue Management for International Relations

Authors: Bilgehan Gültekin, Tuba Gültekin

Abstract:

Peace diplomacy is the most important international tool for keeping peace all over the world. This study, titled 'Peace Diplomacy for International Relations', consists of three parts. In the first part, peace diplomacy is introduced as a tool of peace communication and peace management, and peace communication is explained from an international communication perspective. In the second part of the study, public relations events and communication campaigns are developed specifically for peace diplomacy, with the aim of producing original dialogue-management tools for public communication. The aim of the final part of the study is to produce an original public communication model for international relations. The model includes peace modules, peace management projects, original dialogue procedures and protocols, dialogue education, dialogue management strategies, peace actors, communication models, peace team management, and public diplomacy steps. The creative part of the study aims to develop a model usable in international relations by all countries. The Creative Peace Diplomacy Model is developed for the cases of Turkey-France and Turkey-Greece relations, so the communication and public relations events and campaigns are developed originally for this study.

Keywords: peace diplomacy, public communication model, dialogue management, international relations

Procedia PDF Downloads 541
16478 A Fuzzy Multiobjective Model for Bed Allocation Optimized by Artificial Bee Colony Algorithm

Authors: Jalal Abdulkareem Sultan, Abdulhakeem Luqman Hasan

Abstract:

With the development of competition among health care systems, hospitals face more and more pressure. Meanwhile, resource allocation has a vital effect on achieving competitive advantages in hospitals. Selecting the appropriate number of beds is one of the most important tasks in hospital management. However, in real situations, bed allocation is a multiple-objective problem involving different items, with vagueness and randomness in the data, and is thus very complex. Hence, research on the bed allocation problem that considers multiple departments, nursing hours, and stochastic information about patient arrival and service is relatively scarce. In this paper, we develop a fuzzy multiobjective bed allocation model that handles uncertainty and multiple departments. Fuzzy objectives and weights are applied simultaneously to help managers select the suitable number of beds for different departments. The proposed model is solved using the Artificial Bee Colony (ABC) algorithm, which is a very effective algorithm. The paper describes an application of the model to a public hospital in Iraq. The results show that the fuzzy multiobjective model provides a suitable framework for bed allocation and optimal utilization.
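
A compact artificial bee colony sketch on a stand-in continuous objective (the model's fuzzy objectives, weights, and hospital constraints are not reproduced):

```python
import numpy as np

def abc_minimize(f, dim, bounds, n_food=20, limit=30, iters=200, seed=0):
    """Minimal artificial bee colony: employed bees perturb food sources,
    onlookers favour good sources, scouts replace exhausted ones."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    foods = rng.uniform(lo, hi, (n_food, dim))
    fit = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, int)

    def try_neighbor(i):
        k = rng.integers(n_food - 1)
        k += k >= i                                   # pick a partner != i
        j = rng.integers(dim)
        cand = foods[i].copy()
        cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        cand = np.clip(cand, lo, hi)
        fc = f(cand)
        if fc < fit[i]:
            foods[i], fit[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                       # employed bee phase
            try_neighbor(i)
        probs = fit.max() - fit + 1e-12               # lower cost -> higher weight
        probs /= probs.sum()
        for i in rng.choice(n_food, n_food, p=probs): # onlooker phase
            try_neighbor(i)
        for i in np.where(trials > limit)[0]:         # scout phase
            foods[i] = rng.uniform(lo, hi, dim)
            fit[i], trials[i] = f(foods[i]), 0
    best = fit.argmin()
    return foods[best], fit[best]

sphere = lambda x: float(np.sum(x**2))
x, v = abc_minimize(sphere, dim=4, bounds=(-5.0, 5.0))
print(f"best value {v:.2e} at {np.round(x, 3)}")
```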

Keywords: bed allocation problem, fuzzy logic, artificial bee colony, multi-objective optimization

Procedia PDF Downloads 324
16477 Prevalence and Factors Associated with Work Accidents in the Construction Sector in Benin: Cases of CFIR – Consulting

Authors: Antoine Vikkey Hinson, Menonli Adjobimey, Gemayel Ahmed Biokou, Rose Mikponhoue

Abstract:

Introduction: The construction industry is a critical concern with regard to health and safety worldwide. The World Health Organization revealed that work-related disease and trauma were responsible for the deaths of 1.9 million people in 2016. The aim of this study was to determine the prevalence of, and factors associated with, the occurrence of work accidents in a construction company in Benin. Method: This was a descriptive cross-sectional and analytical study. Data analysis was performed with R software 4.1.1. In the multivariate analysis, we performed a binary logistic regression. Adjusted OR (ORa) association measures and their 95% confidence intervals [CI95%] are presented for the explanatory variables used in the final model. The significance threshold for all tests was 5% (p < 0.05). Results: In this study, 472 workers were included; of these, 452 (95.7%) were men, corresponding to a sex ratio of 22.6. The average age of the workers was 33 ± 8.8 years. Workers were mostly laborers (84.7%) and half had declared having inadequate personal protective equipment (50.6%, n=239). The prevalence of work accidents was 50.8%. Collision with rolling stock (25.8%), cuts (16.2%), and stumbling (16.2%) were the main types of work accidents on the construction site. Four factors were associated with the occurrence of work accidents: fatigue or exhaustion (ORa: 1.53 [1.03; 2.28]); the use of dangerous tools (ORa: 1.81 [1.22; 2.71]); the various laborers' jobs (ORa: 4.78 [2.62; 9.21]); and seniority in the company ≥ 4 years (ORa: 2.00 [1.35; 2.96]). Conclusion: This study allowed us to identify the associated factors. It is imperative to implement a rigorous occupational health and safety policy, in particular continuing safety training for workers, the supply of appropriate work tools, and protective equipment.
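
A sketch of how adjusted odds ratios with 95% confidence intervals are obtained from a binary logistic regression (synthetic data; the variable names are illustrative, not the study's dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 472
df = pd.DataFrame({
    "fatigue":        rng.integers(0, 2, n),
    "dangerous_tool": rng.integers(0, 2, n),
    "seniority_ge4":  rng.integers(0, 2, n),
})
# synthetic outcome whose log-odds depend on the three exposures
log_odds = -0.8 + 0.4 * df.fatigue + 0.6 * df.dangerous_tool + 0.7 * df.seniority_ge4
df["accident"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(df[["fatigue", "dangerous_tool", "seniority_ge4"]])
res = sm.Logit(df["accident"], X).fit(disp=0)

or_table = pd.DataFrame({
    "OR": np.exp(res.params),          # exponentiated coefficients = odds ratios
    "CI low": np.exp(res.conf_int()[0]),
    "CI high": np.exp(res.conf_int()[1]),
}).drop("const")
print(or_table.round(2))
```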

Keywords: prevalence, work accident, associated factors, construction, benin

Procedia PDF Downloads 57
16476 Nonlinear Optical Properties for Three Level Atoms at Resonance and Off-Resonance with Laser Coupled Beams

Authors: Suad M. Abuzariba, Eman O. Mafaa

Abstract:

For a three-level atom interacting with laser beams, the effect of changing between resonance and off-resonance frequencies has been studied. A clear distortion is seen in both the real and imaginary parts of the electric susceptibility as the frequency of the coupled laser beams increases toward the off-resonance interaction. With increasing Rabi frequency of the laser pulse that is resonant with the lower transition, the distortion produces a new peak in both the real and imaginary parts of the electric susceptibility.
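
A sketch using the textbook lambda-type three-level susceptibility (stated as an assumption, not necessarily the authors' exact model); as the coupling Rabi frequency grows, the single absorption peak splits, which is the 'new peak' behaviour described above:

```python
import numpy as np

def chi(delta, omega_c, gamma31=1.0, gamma21=1e-3):
    """Textbook lambda-system probe susceptibility (arbitrary units):
    chi ~ i / (gamma31 - i*delta + (omega_c/2)**2 / (gamma21 - i*delta))."""
    return 1j / (gamma31 - 1j * delta + (omega_c / 2) ** 2 / (gamma21 - 1j * delta))

delta = np.linspace(-5, 5, 2001)          # probe detuning (units of gamma31)
for oc in (0.0, 1.0, 3.0):                # coupling Rabi frequency
    absorption = chi(delta, oc).imag      # Im(chi) ~ absorption
    mask = np.concatenate(([False], np.diff(np.sign(np.diff(absorption))) < 0, [False]))
    peaks = delta[mask]
    print(f"Omega_c = {oc}: {len(peaks)} absorption peak(s) near {np.round(peaks, 2)}")
```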

Keywords: electric susceptibility, resonance frequency, off-resonance frequency, three-level atom, laser

Procedia PDF Downloads 311
16475 The Six 'P' Model: Principles of Inclusive Practice for Inclusion Coaches

Authors: Tiffany Gallagher, Sheila Bennett

Abstract:

Drawing on data from a larger study, this research is set in a small school district in Ontario, Canada, that has made a transition from self-contained classes for students with exceptionalities to inclusive classroom placements for all students with their age-appropriate peers. The school board aided this transition by hiring Inclusion Coaches with a background in special education to work alongside teachers as partners and inform their inclusive practice. Based on qualitative data from four focus groups conducted with Inclusion Coaches, as well as four blog-style reflections collected at various points over two years, six principles of inclusive practice were identified for coaches. The six principles form a model during transition: pre-requisite, process, precipice, promotion, proof, and promise. These principles are encapsulated in a visual model of a spiraling staircase displaying the conditions that exist prior to coaching, during coaching interactions, and for the sustainability of coaching. The six principles are iterative and should be revisited each time a coaching interaction is initiated. Exploring inclusion coaching as a model emulates coaching in other contexts and allows us to examine an established process through a new lens. This research becomes increasingly important as more school boards transition toward inclusive classrooms. The Six 'P' Model: Principles of Inclusive Practice for Inclusion Coaches allows a unique look into a scaffolding model for building educator capacity in an inclusive setting.

Keywords: capacity building, coaching, inclusion, special education

Procedia PDF Downloads 248
16474 Space Tourism Pricing Model Revolution from Time Independent Model to Time-Space Model

Authors: Kang Lin Peng

Abstract:

Space tourism emerged in 2001 and became famous in 2021, following the development of space technology. The space market is distorted because of excess demand. Space tourism is currently rare and extremely expensive, with biased luxury-product pricing; it is a seller's market in which consumers cannot bargain. Spaceship companies such as Virgin Galactic, Blue Origin, and SpaceX have charged space tourism prices from 200 thousand to 55 million US dollars, depending on the altitude reached in space. There should be a reasonable price set on a fair basis. This study aims to derive a spacetime pricing model, which differs from the general pricing model on the earth's surface. We apply general relativity theory to derive a mathematical formula for the space tourism pricing model, which covers the traditional time-independent model. In the future, the price of space travel will differ from that of current flight travel when space travel is measured in light-year units. The pricing of general commodities mainly considers the general equilibrium of supply and demand. A pricing model that considers risks and returns with time as an independent variable is acceptable when commodities are on the earth's surface, called flat spacetime. Current economic theories based on an independent time scale in flat spacetime do not consider the curvature of spacetime. Current flight services flying at heights of 6, 12, and 19 kilometers charge with a pricing model that measures the time coordinate independently. However, the emerging space tourism flights at heights of 100 to 550 kilometers encounter an enlarged spacetime curvature, which means tourists escape from effectively zero curvature on the earth's surface to the larger curvature of space. Different spacetime spans should be considered in the pricing model of space travel to echo general relativity theory. Intuitively, this spacetime commodity needs to account for the change in spacetime curvature from the earth to space. We can assume that the value of each spacetime curvature unit corresponds to the gradient change of each Ricci or energy-momentum tensor; we then know how much to charge by integrating over the spacetime from the earth to space. The concept is to add a price component p corresponding to general relativity theory. The space travel pricing model degenerates into a time-independent model, which becomes the model of traditional commodity pricing. The contribution is that deriving the space tourism pricing model is a breakthrough in philosophical and practical issues for space travel. The results of the space tourism pricing model extend the traditional time-independent, flat-spacetime model. A pricing model embedding spacetime in line with general relativity theory can better reflect the rationality and accuracy of space travel on a universal scale. Moving from an independent time scale to a spacetime scale will bring a brand-new pricing concept for space-travel commodities. Fair and efficient spacetime economics will also benefit human travel when we can travel in light-year units in the future.

Keywords: space tourism, spacetime pricing model, general relativity theory, spacetime curvature

Procedia PDF Downloads 128
16473 Invasive Ranges of Gorse (Ulex europaeus) in South Australia and Sri Lanka Using Species Distribution Modelling

Authors: Champika S. Kariyawasam

Abstract:

The distribution of gorse (Ulex europaeus) in South Australia has been modelled using 126 presence-only location records as a function of seven climate parameters. The predicted range of U. europaeus lies mainly along the Mount Lofty Ranges in the Adelaide Hills and on Kangaroo Island. Annual precipitation and yearly average aridity index appeared to be the variables contributing most to the final model formulation. The jackknife procedure was employed to identify the contribution of different variables to the gorse model outputs, and response curves were used to predict changes with changing environmental variables. Based on this analysis, it was revealed that the combined effect of one or more variables could have a completely different impact on the model prediction than the original variables on their own. This work also demonstrates the need for a careful approach when selecting environmental variables for projecting correlative models to climatically distinct areas. Maxent acts as a robust model when projecting the fitted species distribution model to another area with changing climatic conditions, whereas the generalized linear, bioclim, and domain models proved less robust in this regard. These findings are important not only for predicting and managing invasive alien gorse in South Australia and Sri Lanka but also in other countries of its invasive range.
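
A sketch of the jackknife variable-contribution idea (each variable used alone and omitted in turn), approximated with a logistic model on synthetic presence/background data, since Maxent itself is not reproduced here:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
names = ["annual_precip", "aridity_index", "temp_seasonality", "min_temp"]
X = rng.normal(size=(500, 4))
# synthetic presence/background labels driven mostly by the first two variables
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2])))
y = rng.random(500) < p

def score(cols):
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, cols], y, cv=5, scoring="roc_auc").mean()

print(f"all variables: AUC = {score([0, 1, 2, 3]):.3f}")
for j, name in enumerate(names):
    alone = score([j])                              # variable used by itself
    without = score([k for k in range(4) if k != j])  # variable left out
    print(f"{name:>17}: alone = {alone:.3f}, without = {without:.3f}")
```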

Keywords: invasive species, Maxent, species distribution modelling, Ulex europaeus

Procedia PDF Downloads 134
16472 Elastoplastic and Ductile Damage Model Calibration of Steels for Bolt-Sphere Joints Used in China’s Space Structure Construction

Authors: Huijuan Liu, Fukun Li, Hao Yuan

Abstract:

The bolted spherical node is a common type of joint in space steel structures, and the bolt-sphere joint portion almost always controls the bearing capacity of the bolted spherical node. The investigation of bearing performance and progressive failure in service often requires high-fidelity numerical models. This paper focuses on the constitutive models of the bolt steel and sphere steel used in China's space structure construction. The elastoplastic model is determined from a standard tensile test and a calibrated Voce saturation hardening rule. Ductile damage is found to be dominant based on fractographic analysis. The Rice-Tracey ductile fracture criterion is then selected, and the model parameters are calibrated based on tensile tests of notched specimens. These calibrated material models can benefit research and engineering work in similar fields.
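
A sketch of calibrating a Voce saturation hardening law, σ = σ0 + Q(1 − exp(−b·εp)), against tensile data (the data points below are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def voce(eps_p, sigma0, q, b):
    """Voce saturation hardening: flow stress vs. plastic strain."""
    return sigma0 + q * (1.0 - np.exp(-b * eps_p))

# hypothetical true-stress / plastic-strain pairs from a standard tensile test
eps_p  = np.array([0.00, 0.01, 0.02, 0.04, 0.06, 0.08, 0.10, 0.15])
stress = np.array([235., 280., 310., 352., 375., 390., 398., 410.])  # MPa

popt, _ = curve_fit(voce, eps_p, stress, p0=(200.0, 200.0, 20.0))
sigma0, q, b = popt
print(f"sigma0 = {sigma0:.1f} MPa, Q = {q:.1f} MPa, b = {b:.1f}")
print("saturation stress sigma0 + Q =", round(sigma0 + q, 1), "MPa")
```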

Keywords: bolt-sphere joint, steel, constitutive model, ductile damage, model calibration

Procedia PDF Downloads 136
16471 Exact and Approximate Controllability of Nuclear Dynamics Using Bilinear Controls

Authors: Ramdas Sonawane, Mahaveer Gadiya

Abstract:

The control problem associated with nuclear dynamics is represented by a nonlinear integro-differential equation with additive controls. To control the chain reaction, a certain quantity of neutrons is added into (or withdrawn from) the chamber as and when required, which is not realistic. We can instead think of controlling the reactor dynamics by bilinear control, which enters the system as a coefficient of the state. In this paper, we study the approximate and exact controllability of a parabolic integro-differential equation controlled by bilinear control with non-homogeneous boundary conditions in a bounded domain. We prove the existence of a control and propose an explicit control strategy.

Keywords: approximate control, exact control, bilinear control, nuclear dynamics, integro-differential equations

Procedia PDF Downloads 444
16470 A Comparison of Stress Levels between Students with Parents and Those without Parents

Authors: Hendeh Majdi, Zahra Arzjani

Abstract:

This research compared stress levels between students with parents and those without parents in a descriptive-analytical study. A total of 128 questionnaires (64 for students with parents and 64 for students without parents) were distributed among high school students in Ray city, Tehran province, through stratified sampling. The results showed that stress levels in students without parents were significantly higher, and the most important proposal is that further study should be devoted to decreasing stress levels in students without parents.

Keywords: stress, students with parents, without parents, Ray city

Procedia PDF Downloads 499
16469 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)

Authors: Longqing Li

Abstract:

The paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal distribution or a Student's t distribution. Specifically, it focuses on the one-day-ahead VaR of the daily returns of major stock markets in the US, UK, China, and Hong Kong over the most recent ten years, at the 95% confidence level. To improve predictive power and search for the best-performing model, the paper proposes two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The main contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on their differences in estimating the one-day-ahead VaR. Second, to account for the non-normality of financial market distributions, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic backtesting procedure is employed to assess the performance of each model in the GARCH family and of the conditional EVT. The conclusion is that Exponential GARCH yields the best estimate in out-of-sample one-day-ahead VaR forecasting, while the performance gap between the GARCH family and the conditional EVT is indistinguishable.
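
A sketch of a one-day-ahead VaR from an EGARCH(1,1) model with GED innovations using the arch package (synthetic returns stand in for the market data, and the 5% quantile is taken from the standardized residuals, a filtered-historical shortcut rather than the paper's exact procedure):

```python
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(11)
returns = pd.Series(rng.standard_t(6, 2500))     # synthetic daily returns, in %

# EGARCH(1,1) with an asymmetry term and GED-distributed innovations
am = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="ged")
res = am.fit(disp="off")

fc = res.forecast(horizon=1)
mu = fc.mean.iloc[-1, 0]
sigma = np.sqrt(fc.variance.iloc[-1, 0])
q05 = np.quantile(res.std_resid.dropna(), 0.05)  # 5% quantile of standardized residuals

var_95 = -(mu + sigma * q05)                     # positive number = potential loss
print(f"one-day 95% VaR: {var_95:.2f}% of portfolio value")
```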

Keywords: Value-at-Risk, Extreme Value Theory, conditional EVT, backtesting

Procedia PDF Downloads 321
16468 Sound Performance of a Composite Acoustic Coating with Embedded Parallel Plates under Hydrostatic Pressure

Authors: Bo Hu, Shibo Wang, Haoyang Zhang, Jie Shi

Abstract:

With the development of sonar detection technology, the acoustic stealth technology of underwater vehicles faces severe challenges. Underwater acoustic coatings are developing toward low-frequency absorption capability and broad absorption bandwidth. In this paper, an acoustic model of an underwater coating made of composite material embedded with a periodic steel structure is presented. The model has multiple high absorption peaks in the frequency range of 1 kHz-8 kHz, achieving high sound absorption over a broad bandwidth. The frequencies of the absorption peaks are found to be related to the classic half-wavelength transmission principle. The sound absorption performance of the acoustic model is investigated by the finite element method using COMSOL software, and the sound absorption mechanism of the proposed model is explained by the distributions of the displacement vector field. The influence of the geometric parameters of the periodic steel structure, including plate thickness and spacing, on the sound absorption of the proposed model is further discussed. The acoustic model proposed in this study provides an idea for the design of underwater low-frequency broadband acoustic coatings, and the results show its feasibility for practical underwater applications.
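
The half-wavelength principle pins the peaks at f_n = n·c/(2L) for a layer of thickness L and sound speed c; a sketch (the effective sound speed and thickness below are assumptions, not the paper's values):

```python
SOUND_SPEED = 200.0   # assumed effective speed in a soft viscoelastic coating, m/s

def half_wavelength_peaks(c, thickness, n_max=4):
    """Frequencies where the layer thickness equals n half-wavelengths:
    f_n = n * c / (2 * L)."""
    return [n * c / (2.0 * thickness) for n in range(1, n_max + 1)]

for f in half_wavelength_peaks(SOUND_SPEED, thickness=0.05):
    print(f"{f / 1000:.1f} kHz")   # peaks at 2, 4, 6, 8 kHz for these assumptions
```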

Keywords: acoustic coating, composite material, broad frequency bandwidth, sound absorption performance

Procedia PDF Downloads 174
16467 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network

Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan

Abstract:

The most aggressive form of brain tumor is called glioma, a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model, and the BRATS 2015 MRI dataset to evaluate the proposed model. Furthermore, we used the SimpleITK open-source library to analyze the images. We extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation; 2D patches were extracted instead of 3D ones because their lower dimensionality helps reduce computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for the complete, 0.76 for the core, and 0.77 for the enhanced tumor regions; these results are comparable with already-implemented 2D CNN architectures.
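
A minimal Keras 2D-CNN patch classifier plus a Dice score helper in the spirit described (modern TensorFlow-Keras imports are used here, and the layer sizes and patch shape are assumptions, not the paper's exact architecture):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

PATCH = 33  # assumed square patch size around each voxel

model = keras.Sequential([
    layers.Input(shape=(PATCH, PATCH, 4)),        # 4 MRI modalities as channels
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),        # tumor vs. non-tumor centre pixel
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

def dice_similarity(y_true, y_pred):
    """DSC = 2|A intersect B| / (|A| + |B|) on binary masks."""
    y_true, y_pred = np.asarray(y_true).astype(bool), np.asarray(y_pred).astype(bool)
    denom = y_true.sum() + y_pred.sum()
    return 2.0 * np.logical_and(y_true, y_pred).sum() / denom if denom else 1.0

# smoke test on random patches
x = np.random.rand(8, PATCH, PATCH, 4).astype("float32")
y = np.random.randint(0, 2, 8)
model.fit(x, y, epochs=1, verbose=0)
print("DSC on dummy masks:", dice_similarity(y, model.predict(x, verbose=0).ravel() > 0.5))
```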

Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG

Procedia PDF Downloads 182
16466 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies are seeing an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and vending of the model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort in building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 64
16465 Data-Driven Dynamic Overbooking Model for Tour Operators

Authors: Kannapha Amaruchkul

Abstract:

We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In an economic-based policy, we minimize the expected cost of overselling and underuse, whereas in a service-based policy we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy on 2019 data. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
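
A sketch of the service-based policy for two-person reservations: accept the largest number of bookings such that the oversold probability stays below a threshold (the show probability, capacity, and threshold are assumptions, not values from the paper):

```python
from scipy.stats import binom

def service_based_limit(capacity, group_size=2, p_show=0.85, max_oversold_prob=0.05):
    """Largest number n of accepted reservations such that
    P(showing groups * group_size > capacity) <= max_oversold_prob,
    where the number of showing reservations is Binomial(n, p_show)."""
    max_groups = capacity // group_size      # groups that fit without overselling
    n = max_groups
    while binom.sf(max_groups, n + 1, p_show) <= max_oversold_prob:
        n += 1                               # accepting one more booking stays safe
    return n

limit = service_based_limit(capacity=40)
print(f"accept up to {limit} two-person reservations for 40 seats")
```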

Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator

Procedia PDF Downloads 134