Search results for: modelling approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14631

14301 Differences Choosing Closed Approach or Open Approach in Rhinoplasty Outcomes

Authors: Alessandro Marano

Abstract:

Aim: The author describes a strategy for choosing between two different rhinoplasty approaches for the treatment of rhinoplasty outcomes. Methods: Case series study. Both rhinoplasty approaches have advantages and disadvantages. With the open approach, the techniques for shaping and restoring nasal structures in rhinoplasty outcomes are easier to manage; on the other side, the closed approach requires more practice and experience to achieve good results. Results: The author’s choice for rhinoplasty outcomes is the closed approach, although the open approach is more commonly preferred due to easier management and better visualization of nasal structures. Conclusions: Both approaches are valid for the treatment of rhinoplasty outcomes; the author's preferred approach is the closed one, with minimally invasive modifications focused on restoring nasal function and aesthetics.

Keywords: rhinoplasty, aesthetic, face, outcomes

Procedia PDF Downloads 91
14300 Quality Assessment of New Zealand Mānuka Honeys Using Hyperspectral Imaging Combined with Deep 1D-Convolutional Neural Networks

Authors: Hien Thi Dieu Truong, Mahmoud Al-Sarayreh, Pullanagari Reddy, Marlon M. Reis, Richard Archer

Abstract:

New Zealand mānuka honey is a honeybee product derived mainly from Leptospermum scoparium nectar. The potent antibacterial activity of mānuka honey derives principally from methylglyoxal (MGO), in addition to the hydrogen peroxide and other lesser activities present in all honey. MGO is formed from dihydroxyacetone (DHA), which is unique to L. scoparium nectar. Mānuka honey also has an idiosyncratic phenolic profile that is useful as a chemical marker. Authentic mānuka honey is highly valuable, but almost all honey is formed from natural mixtures of nectars harvested by a hive over a period of time. Once diluted by other nectars, mānuka honey irrevocably loses value. We aimed to apply hyperspectral imaging (HSI) to honey frames before bulk extraction to minimise the dilution of genuine mānuka by other honey and ensure authenticity at the source. This technology is non-destructive and suitable for an industrial setting. Chemometrics using linear Partial Least Squares (PLS) and Support Vector Machine (SVM) models showed limited efficacy in interpreting the chemical footprint due to strongly non-linear relationships between predictors and predictand in a large sample set, likely reflecting honey quality variability across geographic regions. Therefore, an advanced modelling approach, one-dimensional convolutional neural networks (1D-CNN), was investigated for analysing the hyperspectral data and extracting biochemical information from honey. The 1D-CNN model showed superior prediction of honey quality (R² = 0.73, RMSE = 2.346, RPD = 2.56) compared with PLS (R² = 0.66, RMSE = 2.607, RPD = 1.91) and SVM (R² = 0.67, RMSE = 2.559, RPD = 1.98). Classification of mono-floral mānuka honey from multi-floral and non-mānuka honey exceeded 90% accuracy for all models tested. Overall, this study reveals the potential of HSI and deep learning modelling for automating the evaluation of honey quality in frames.
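
As an illustration of the modelling approach, the sketch below shows a minimal 1D-CNN for per-spectrum regression of a honey quality attribute. The band count, layer sizes, and synthetic data are assumptions for the example, not the architecture or data reported in the paper.

```python
import numpy as np
import tensorflow as tf

# minimal 1D-CNN for per-pixel hyperspectral regression; the 288-band
# spectrum length and all layer sizes are illustrative assumptions
n_bands = 288
model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_bands, 1)),
    tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),            # predicted quality attribute (e.g. MGO)
])
model.compile(optimizer="adam", loss="mse")

# synthetic stand-in spectra: 500 samples with a weak nonlinear signal
X = np.random.rand(500, n_bands, 1).astype("float32")
y = (X[:, 50, 0] ** 2 + 0.1 * np.random.rand(500)).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```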

Keywords: mānuka honey, quality, purity, potency, deep learning, 1D-CNN, chemometrics

Procedia PDF Downloads 116
14299 Modelling of Rate-Dependent Hysteresis of Polypyrrole Dual Sensing-Actuators for Precise Position Control

Authors: Johanna Schumacher, Toribio F. Otero, Victor H. Pascual

Abstract:

Bending dual sensing-actuators based on electroactive polymers are faradaic motors, meaning that the consumed charge determines the actuator’s tip position. During actuation, the charges consumed during oxidation and reduction result in different tip positions, showing dynamic hysteresis effects with errors of up to 25%. For precise position control of these actuators, characterization of the hysteresis effect due to irreversible reactions is crucial. Here, the investigation and modelling of dynamic hysteresis effects of polypyrrole-dodecylbenzenesulfonate (PPyDBS) actuators under ambient working conditions are presented. The hysteresis effect is studied for charge consumption at different frequencies, and a rate-dependent hysteresis model is derived. The hysteresis model is implemented in a closed-loop system and verified experimentally.
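
For readers unfamiliar with rate-dependent hysteresis modelling, the sketch below implements a generic play (backlash) operator, the building block of Prandtl-Ishlinskii hysteresis models, with a threshold that widens with the input rate. It is purely illustrative and is not the authors' identified PPyDBS model; the threshold law is an assumption.

```python
import numpy as np

def play_operator(u, r):
    """Backlash operator: y[k] = max(u[k]-r[k], min(u[k]+r[k], y[k-1]))."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = max(u[k] - r[k], min(u[k] + r[k], y[k - 1]))
    return y

t = np.linspace(0, 4 * np.pi, 2000)
u = np.sin(t)                           # driving input, e.g. consumed charge
du = np.gradient(u, t)                  # input rate
r = 0.1 + 0.05 * np.abs(du)             # rate-dependent threshold (assumed law)
y = play_operator(u, r)                 # hysteretic output, e.g. tip position
```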

Keywords: dual sensing-actuator, electroactive polymers, hysteresis, position control

Procedia PDF Downloads 370
14298 2D RF ICP Torch Modelling with Fluid Plasma

Authors: Mokhtar Labiod, Nabil Ikhlef, Keltoum Bouherine, Olivier Leroy

Abstract:

A numerical model of a radio-frequency (RF) argon discharge chamber is developed to simulate low-pressure, low-temperature inductively coupled plasma. This model will be of fundamental importance in the design of the plasma magnetic control system. Electric and magnetic fields inside the discharge chamber are evaluated by solving a magnetic vector potential equation. First, the equations of ideal magnetohydrodynamic theory are presented, describing the basic behaviour of magnetically confined plasma; these equations are discretized with the finite element method in cylindrical coordinates. The discharge chamber is assumed to be axially symmetric, and the plasma is treated as a compressible gas. Plasma generation due to ionization is added to the continuity equation, and the magnetic vector potential equation is solved for the electromagnetic fields. A strong dependence of the plasma properties on the discharge conditions and the gas temperature is obtained.
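
For orientation, a common axisymmetric, time-harmonic form of the magnetic vector potential equation solved for RF ICP torches (written here for an azimuthal potential A_θ at excitation frequency ω; the paper's exact formulation may differ) is:

```latex
\frac{\partial^2 A_\theta}{\partial r^2}
+ \frac{1}{r}\frac{\partial A_\theta}{\partial r}
- \frac{A_\theta}{r^2}
+ \frac{\partial^2 A_\theta}{\partial z^2}
= i\omega\mu_0\sigma A_\theta - \mu_0 J_{\mathrm{coil}}
```

where σ is the plasma electrical conductivity, μ₀ the vacuum permeability, and J_coil the applied coil current density.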

Keywords: direct-coupled model, magnetohydrodynamic, modelling, plasma torch simulation

Procedia PDF Downloads 414
14297 Choral Singers' Preference for Expressive Priming Techniques

Authors: Shawn Michael Condon

Abstract:

Current research on teaching expressivity mainly involves instrumentalists. This study focuses on choral singers’ preferences for priming techniques based on four methods for teaching expressivity. A total of 112 choral singers answered a survey about their preferred methods for priming expressivity (vocal modelling, using metaphor, tapping into felt emotions, and drawing on past experiences) in three conditions (active, passive, and instructor). Analysis revealed a higher preference for drawing on past experiences among more experienced singers. The most preferred technique in the passive and instructor roles was vocal modelling, with metaphors and tapping into felt emotions favoured in an active role. Priming techniques are often used in combination with other methods to enhance singing technique or expressivity and are dependent upon the situation, the repertoire, and the preferences of the instructor and performer.

Keywords: emotion, expressivity, performance, singing, teaching

Procedia PDF Downloads 139
14296 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading

Authors: Jerome Joshi

Abstract:

The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. Stochastic Pi Calculus is mainly used for biological applications; however, its features promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), is difficult to model. It involves participants seeking the advantage of complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, a trader must be at the top of the order book quite frequently, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is not best for the trader either, since it was the cause of the outbreak of the ‘Hot-Potato Effect’, which in turn demands a better and more efficient model. The desired model should be flexible and have diverse applications, so a model proven in a similar field characterized by such difficulty should be chosen. It should also be flexible in its simulation, so that it can be extended and adapted for future research, and equipped with tools suited to the field of finance. On these criteria, the Stochastic Pi Calculus model seems an ideal fit for financial applications, owing to its track record in biology. It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided its application is suitably extended. The model focuses on solving the problem that led to the ‘Flash Crash’, namely the ‘Hot-Potato Effect’. It consists of small sub-systems that can be integrated to form a large system, and it is designed so that the behaviour of ‘noise traders’ is treated as a random process, or noise, in the system. To gain a better understanding of the problem, the modelling takes a broad view encompassing the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the ‘Flash Crash’, the ‘Hot-Potato Effect’, evaluation of orders, and time delay in further detail. Future work should focus on the calibration of the modules so that they interact perfectly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing.
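
To make the process-calculus idea concrete, the toy sketch below simulates traders as processes that synchronise on channels with exponentially distributed rates, the execution model underlying stochastic pi calculus tools. Channel names, rates, and the market rules are invented for illustration and are not the paper's model.

```python
import random
random.seed(1)

# toy stochastic-pi-style market: 'submit', 'match' and 'cancel' are channels
# on which processes interact; the next firing is sampled with the Gillespie SSA
state = {"fundamental": 50, "intermediary": 10, "orders": 0}
rates = {"submit": 0.8, "match": 1.2, "cancel": 0.3}

def propensities(s):
    return {
        "submit": rates["submit"] * s["fundamental"],               # post an order
        "match": rates["match"] * s["intermediary"] * s["orders"],  # trade against it
        "cancel": rates["cancel"] * s["orders"],                    # withdraw it
    }

t, t_end = 0.0, 10.0
while t < t_end:
    a = propensities(state)
    total = sum(a.values())
    if total == 0:
        break
    t += random.expovariate(total)       # waiting time to the next interaction
    pick, acc = random.uniform(0, total), 0.0
    for name, val in a.items():          # choose which channel fires
        acc += val
        if pick <= acc:
            state["orders"] += 1 if name == "submit" else -1
            break
print(t, state)
```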

Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus

Procedia PDF Downloads 57
14295 An Incremental Refinement Approach to a Development of Dynamic Host Configuration Protocol (DHCP) Using Event-B

Authors: Rajaa Filali, Mohamed Bouhdadi

Abstract:

This paper presents an incremental development of the Dynamic Host Configuration Protocol (DHCP) in Event-B. DHCP is a widely used communication protocol that provides a standard mechanism for obtaining configuration parameters. The specification is performed in a stepwise manner and verified through a series of refinements. The Event-B formal method uses the Rodin platform for modelling and verifying properties of the protocol such as safety, liveness, and deadlock freedom. Event-B provides an accessible and rigorous development method, and this interaction between modelling and proving reduces complexity and helps to eliminate misunderstandings, inconsistencies, and specification gaps.
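
As a rough analogy to the Event-B development (not Rodin syntax), the Python sketch below models the DHCP Discover-Offer-Request-Ack exchange as guarded events with an invariant checked after each one, the role played by proof obligations in Event-B. The states and the invariant are illustrative assumptions.

```python
class DhcpClient:
    """Toy DORA state machine; each method is an 'event' with a guard."""
    def __init__(self):
        self.state, self.offered_ip, self.bound_ip = "INIT", None, None

    def check_invariant(self):
        # safety invariant: an address may only be bound in the ACK state
        assert self.bound_ip is None or self.state == "ACK"

    def discover(self):
        assert self.state == "INIT"                # event guard
        self.state = "DISCOVER"; self.check_invariant()

    def offer(self, ip):
        assert self.state == "DISCOVER"
        self.offered_ip, self.state = ip, "OFFER"; self.check_invariant()

    def request(self):
        assert self.state == "OFFER"
        self.state = "REQUEST"; self.check_invariant()

    def ack(self):
        assert self.state == "REQUEST"
        self.bound_ip, self.state = self.offered_ip, "ACK"; self.check_invariant()

c = DhcpClient()
c.discover(); c.offer("192.168.0.10"); c.request(); c.ack()
print(c.bound_ip)
```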

Keywords: DHCP protocol, Event-B, refinement, proof obligation, Rodin

Procedia PDF Downloads 206
14294 Modelling of Moisture Loss and Oil Uptake during Deep-Fat Frying of Plantain

Authors: James A. Adeyanju, John O. Olajide, Akinbode A. Adedeji

Abstract:

A predictive mathematical model based on the fundamental principles of mass transfer was developed to simulate the moisture content and oil content during the Deep-Fat Frying (DFF) of dodo (fried plantain). The governing equation, a partial differential equation describing the rate of moisture loss and oil uptake, was solved numerically using an explicit Finite Difference Technique (FDT). Computer codes were written in the MATLAB environment to implement the FDT at different frying conditions and to simulate moisture loss and oil uptake during DFF. Plantain samples were sliced to 5 mm thickness and fried at different oil temperatures (150, 160 and 170 °C) for periods varying from 2 to 4 min. Comparison between the predicted results and experimental data for the validation of the model showed reasonable agreement. The correlation coefficients between the predicted and experimental values of the moisture and oil transfer models ranged from 0.912 to 0.947 and from 0.895 to 0.957, respectively. The predicted results can be used for the design, control, and optimization of the deep-fat frying process.
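
A minimal sketch of the explicit finite-difference idea, assuming Fickian moisture diffusion in a slice with fixed surface moisture; the diffusivity and the initial and boundary values below are placeholders, not the fitted values of the study.

```python
import numpy as np

L = 5e-3                      # slice thickness (m), as in the study
D = 1e-9                      # effective moisture diffusivity (m^2/s), assumed
nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # satisfies FTCS stability: D*dt/dx^2 <= 0.5

M = np.full(nx, 2.0)          # initial moisture content (dry basis), assumed
M_surface = 0.05              # surface moisture during frying, assumed
t, t_end = 0.0, 180.0         # simulate 3 min of frying
while t < t_end:
    M[0] = M[-1] = M_surface  # oil-contact faces held at surface moisture
    M[1:-1] += D * dt / dx**2 * (M[2:] - 2 * M[1:-1] + M[:-2])
    t += dt
print("mean moisture after frying:", M.mean())
```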

Keywords: frying, moisture loss, modelling, oil uptake

Procedia PDF Downloads 422
14293 Improving Our Understanding of the in vivo Modelling of Psychotic Disorders

Authors: Zsanett Bahor, Cristina Nunes-Fonseca, Gillian L. Currie, Emily S. Sena, Lindsay D.G. Thomson, Malcolm R. Macleod

Abstract:

Psychosis is ranked as the third most disabling medical condition in the world by the World Health Organization. Despite a substantial amount of research in recent years, available treatments are not universally effective and have a wide range of adverse side effects. Since many clinical drug candidates are identified through in vivo modelling, a deeper understanding of these models, and of their strengths and limitations, might help us understand the reasons for difficulties in psychosis drug development. To provide an unbiased summary of the preclinical psychosis literature, we performed a systematic electronic search of PubMed for publications modelling a psychotic disorder in vivo, identifying 14,721 relevant studies. Double screening of 11,000 publications from this dataset has so far identified 2,403 animal studies of psychosis, with schizophrenia the most commonly modelled disorder (95%). Of these models, 61% are induced using pharmacological agents, and across all models only 56% of publications test a therapeutic treatment. We propose a systematic review of these studies to assess the prevalence of reporting of measures to reduce the risk of bias, and a meta-analysis to assess the internal and external validity of these animal models. Our findings are likely to be relevant to future preclinical studies of psychosis, as this generation of strong empirical evidence has the potential to identify weaknesses and areas for improvement and to suggest refinements of experimental design. Such a detailed understanding of the data that inform what we think we know will help improve the current attrition rate between bench and bedside in psychosis research.

Keywords: animal models, psychosis, systematic review, schizophrenia

Procedia PDF Downloads 270
14292 A Time-Reducible Approach to Compute Determinant |I-X|

Authors: Wang Xingbo

Abstract:

Computation of determinants of the form |I-X| is fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to computing the determinant |I-X|. The approach is derived from Newton’s identities, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail. Comparison with classical approaches shows the new approach to be superior, as its computational time naturally decreases as the efficiency of computing the eigenvalues of the square matrix improves.
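
A sketch of the trace-based route the abstract alludes to: Newton's identities convert the power sums p_k = tr(X^k) into elementary symmetric polynomials e_k of the eigenvalues, and det(I-X) = Σ_k (-1)^k e_k. The matrix-power loop below is a naive O(n⁴) illustration; the paper's point is that the cost can be brought down to that of an eigenvalue computation.

```python
import numpy as np

def det_I_minus_X(X):
    n = X.shape[0]
    # power sums p_k = tr(X^k)
    p = np.zeros(n + 1)
    Xk = np.eye(n)
    for k in range(1, n + 1):
        Xk = Xk @ X
        p[k] = np.trace(Xk)
    # Newton's identities: k*e_k = sum_{i=1..k} (-1)^(i-1) e_{k-i} p_i
    e = np.zeros(n + 1)
    e[0] = 1.0
    for k in range(1, n + 1):
        e[k] = sum((-1) ** (i - 1) * e[k - i] * p[i] for i in range(1, k + 1)) / k
    # det(I - X) = prod(1 - lambda_i) = sum_{k=0..n} (-1)^k e_k
    return sum((-1) ** k * e[k] for k in range(n + 1))

X = np.random.rand(5, 5)
print(det_I_minus_X(X), np.linalg.det(np.eye(5) - X))   # the two should agree
```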

Keywords: algorithm, determinant, computation, eigenvalue, time complexity

Procedia PDF Downloads 398
14291 A Design of an Augmented Reality Based Virtual Heritage Application

Authors: Stephen Barnes, Ian Mills, Frances Cleary

Abstract:

Augmented and virtual reality-based applications offer many benefits for the heritage and tourism sector. The technology provides a platform to showcase regions of interest to people without requiring them to be physically present, which has had a positive impact on enticing tourists to visit those locations. It also provides the opportunity to present historical artefacts in a form that accurately represents their original, intended appearance. Three sites of interest were identified in the Lingaun Valley in South East Ireland, and virtual representations of site-specific artefacts of interest were created by a multidisciplinary team encompassing archaeology, art history, 3D modelling, design, and software development. The collated information is presented to users via an augmented reality mobile application that delivers information in an engaging manner, encouraging an interest in history as well as visits to the sites in the Lingaun Valley.

Keywords: augmented reality, virtual heritage, 3D modelling, archaeology, virtual representation

Procedia PDF Downloads 61
14290 Enhancing Communicative Skills for Students in Automatics

Authors: Adrian Florin Busu

Abstract:

The communicative approach, or communicative language teaching (CLT), used for enhancing communicative skills in students of automatics, is a modern teaching approach based on the concept of learning a language through having to communicate real meaning. In the communicative approach, real communication is both the objective of learning and the means through which it takes place. The approach was initiated during the 1970s and quickly became prominent, as it proposed an alternative to the previous systems-oriented approaches. In other words, instead of focusing on the acquisition of grammar and vocabulary, the communicative approach aims at developing students’ competence to communicate in the target language, with an enhanced focus on real-life situations. In a nutshell, CLT considers using the language to be just as important as actually learning it.

Keywords: communication, approach, objective, learning

Procedia PDF Downloads 136
14289 Centrifuge Modelling Approach to Seismic Loading Analysis of Clay: A Geotechnical Study

Authors: Anthony Quansah, Tresor Ntaryamira, Shula Mushota

Abstract:

Models for geotechnical centrifuge testing are usually made from reformed soil, allowing comparisons with naturally occurring soil deposits. However, this process involves a fundamental omission: natural soil is deposited in layers, creating a unique structure. The nonlinear dynamics of clay deposits play an essential part in changing the characteristics of ground motion under strong seismic loading, particularly when the different amplification behaviours of acceleration and displacement are considered. This paper presents a review of centrifuge shaking-table tests and numerical simulations investigating offshore clay deposits subjected to seismic loading. The observations are simulated in DEEPSOIL with soil models and parameters drawn from notable centrifuge modelling studies, and one-dimensional site response analyses are then performed in both the time and frequency domains. The results reveal that when a deep soft clay deposit is subjected to large earthquakes, significant acceleration attenuation may occur near the top of the deposit due to soil nonlinearity and even local shear failure; however, large amplification of displacement at low frequencies is expected regardless of the intensity of the base motion. This suggests that for displacement-sensitive offshore foundations and structures, such amplified low-frequency displacement response will play an essential part in seismic design. This research shows the centrifuge to be a tool for creating the layered samples that are important for modelling true soil behaviour (such as permeability) that is not identical in all directions; currently, there are few methods for creating layered soil samples.

Keywords: seismic analysis, layered modeling, terotechnology, finite element modeling

Procedia PDF Downloads 135
14288 Currency Exchange Rate Forecasts Using Quantile Regression

Authors: Yuzhi Cai

Abstract:

In this paper, we discuss a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. Together with a forecast-combining technique, we then predict USD to GBP currency exchange rates. Combined forecasts contain all the information captured by the fitted QAR models at different quantile levels and are therefore better than those obtained from individual models. Our results show that an unequally weighted combining method performs better than other forecasting methodologies. We found that a median AR model can perform well in point forecasting when the predictive density functions are symmetric. However, in practice, using the median AR model alone may lose information about the data captured by the other QAR models. We recommend that combined forecasts be used whenever possible.
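
A frequentist stand-in for the QAR idea (the paper itself uses Bayesian MCMC estimation): fit quantile regressions of the series on its own lag at several quantile levels, then average the resulting one-step forecasts. The simulated series, the single lag, and the equal combining weights are illustrative; the paper favours unequal weights.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# simulated AR(1)-style return series standing in for exchange-rate data
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.6 * y[t - 1] + rng.normal()
df = pd.DataFrame({"y": y[1:], "y_lag1": y[:-1]})

# QAR(1) fits at several quantile levels, then a combined point forecast
forecasts = {}
for tau in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("y ~ y_lag1", df).fit(q=tau)
    forecasts[tau] = fit.predict(pd.DataFrame({"y_lag1": [y[-1]]}))[0]
combined = np.mean(list(forecasts.values()))   # equal weights for simplicity
print(forecasts, combined)
```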

Keywords: combining forecasts, MCMC, predictive density functions, quantile forecasting, quantile modelling

Procedia PDF Downloads 237
14287 The Russian Preposition 'за': A Cognitive Linguistic Approach

Authors: M. Kalyuga

Abstract:

Prepositions have long been considered one of the major challenges for second language learners, since they have multiple uses that differ greatly from one language to another. The traditional approach to second language teaching supplies students with a list of uses of a preposition that they have to memorise, and no explanation is provided. Contrary to the traditional grammar approach, the cognitive linguistic approach offers an explanation for the use of prepositions and provides strategies to comprehend and learn prepositions that would otherwise seem obscure. The present paper demonstrates the use of the cognitive approach in explaining prepositions through the example of the Russian preposition 'за'. The paper shows how the various spatial and non-spatial uses of this preposition are linked together through metaphorical and metonymical mapping. The diversity of expressions with 'за' is explained by the range of spatial scenes this preposition is associated with.

Keywords: language teaching, Russian, preposition 'за', cognitive approach

Procedia PDF Downloads 433
14286 Improving Predictions of Coastal Benthic Invertebrate Occurrence and Density Using a Multi-Scalar Approach

Authors: Stephanie Watson, Fabrice Stephenson, Conrad Pilditch, Carolyn Lundquist

Abstract:

Spatial data detailing both the distribution and density of functionally important marine species are needed to inform management decisions. Species distribution models (SDMs) have proven helpful in this regard; however, models often focus only on species occurrences derived from spatially expansive datasets and lack the resolution and detail required to inform regional management decisions. Boosted regression trees (BRT) were used to produce high-resolution (250 m) SDMs at two spatial scales, predicting probability of occurrence, abundance (count per sample unit), density (count per km²), and uncertainty for seven coastal seafloor taxa that vary in habitat usage and distribution, in order to examine prediction differences and implications for coastal management. We investigated whether small-scale, regionally focussed models (82,000 km²) can provide improved predictions compared to data-rich national-scale models (4.2 million km²). We explored the variability in predictions across model type (occurrence vs abundance) and model scale to determine whether specific taxa models or model types are more robust to geographical variability. National-scale occurrence models correlated well with broad-scale environmental predictors, resulting in higher AUC (area under the receiver operating curve) and deviance explained scores; however, they tended to overpredict in the coastal environment and lacked spatially differentiated detail for some taxa. Regional models had lower overall performance, but for some taxa, spatial predictions were more differentiated at a localised ecological scale. National density models were often spatially refined and highlighted areas of ecological relevance, producing more useful outputs than regional-scale models. A two-scale approach aids the selection of the most optimal combination of models to create a spatially informative density model, as results contrasted for specific taxa between model type and scale. However, it is vital that robust predictions of occurrence and abundance are generated as inputs for the combined density model, as areas that do not spatially align between models must be discarded. This study demonstrates the variability in SDM outputs created over different geographical scales and highlights implications and opportunities for managers utilising these tools for regional conservation, particularly in data-limited environments.
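
A minimal sketch of the BRT workflow for the occurrence models, using scikit-learn's gradient boosting as a stand-in for the BRT implementation; the synthetic predictors, hyperparameters, and AUC evaluation are illustrative only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# synthetic stand-in: rows = 250 m grid cells, columns = environmental predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5))                  # e.g. depth, temperature, ...
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))
y = rng.binomial(1, p)                          # presence/absence of a taxon

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
brt = GradientBoostingClassifier(n_estimators=500, learning_rate=0.01,
                                 max_depth=3, subsample=0.75)
brt.fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, brt.predict_proba(X_te)[:, 1]))
```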

Keywords: benthic ecology, spatial modelling, multi-scalar modelling, marine conservation

Procedia PDF Downloads 53
14285 The Use of Rule-Based Cellular Automata to Track and Forecast the Dispersal of Classical Biocontrol Agents at Scale, with an Application to the Fopius arisanus Fruit Fly Parasitoid

Authors: Agboka Komi Mensah, John Odindi, Elfatih M. Abdel-Rahman, Onisimo Mutanga, Henri Ez Tonnang

Abstract:

Ecosystems are networks of organisms and populations that form communities of various species interacting within their habitats. Such habitats are defined by abiotic and biotic conditions that establish the initial limits to a population's growth, development, and reproduction. The habitat’s conditions set the context in which species interact to access resources such as food, water, space, shelter, and mates, allowing for feeding, dispersal, and reproduction. Dispersal is an essential life-history strategy that affects gene flow, resource competition, population dynamics, and species distributions. Despite its importance in population dynamics and survival, understanding the mechanisms underpinning the dispersal of organisms remains challenging. For instance, when an organism moves into an ecosystem for survival and resource competition, its progression is highly influenced by factors such as its physiological state, climatic variables, and its ability to evade predation; greater spatial detail is therefore necessary to understand dispersal dynamics. Organism dispersal can be addressed using empirical and mechanistic modelling approaches, with the adopted approach depending on the study's purpose. Cellular automata (CA) are one such approach that has been successfully used in biological studies to analyse the dispersal of living organisms. A cellular automaton can be briefly described as a grid of cells, occupied or not by individuals, that evolves according to a set of neighbourhood rules. However, for modelling the dispersal of individual organisms at the landscape scale, we lack user-friendly tools that do not require expertise in mathematical modelling and computing, such as a visual analytics framework for tracking and forecasting the dispersal behaviour of organisms. The term "visual analytics" (VA) describes a semi-automated approach to electronic data processing that is guided by users who can interact with the data via an interface. Essentially, VA converts large amounts of quantitative or qualitative data into graphical formats that can be customized to the operator's needs. This approach can also enhance the ability of users from various backgrounds to understand data, communicate results, and disseminate information across a wide range of disciplines. To support effective analysis of organism dispersal at the landscape scale, we therefore designed Pydisp, a free visual analytics tool for spatiotemporal dispersal modelling built in Python. Its user interface allows users to perform quick, interactive spatiotemporal analyses of species dispersal using bioecological and climatic data. Pydisp enables reuse and upgrading through simple principles such as fuzzy cellular automata algorithms. The potential of dispersal modelling is demonstrated in a case study predicting the dispersal of Fopius arisanus (Sonan), an endoparasitoid used to control Bactrocera dorsalis (Hendel) (Diptera: Tephritidae), in Kenya. The results from this example clearly illustrate the parasitoid's dispersal process at the landscape level and confirm that dynamic processes in an agroecosystem are better understood when designed using mechanistic modelling approaches. Furthermore, as demonstrated in the example, the software is highly effective in portraying the dispersal of organisms despite the unavailability of detailed data on the species' dispersal mechanisms.
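
To illustrate the rule-based CA idea behind such tools, the sketch below spreads an organism across a grid where colonisation of a neighbouring cell is a Bernoulli trial weighted by a habitat-suitability layer (a crude stand-in for a fuzzy membership surface). The grid size, rates, and suitability layer are invented for the example and are not Pydisp's internals.

```python
import numpy as np

rng = np.random.default_rng(42)
suitability = rng.random((100, 100))   # stand-in for a bioclimatic layer (0..1)
occupied = np.zeros((100, 100), dtype=bool)
occupied[50, 50] = True                # release point of the organism

for step in range(25):
    # cells adjacent to an occupied cell (von Neumann neighbourhood)
    nbr = np.zeros_like(occupied)
    nbr[1:, :] |= occupied[:-1, :]
    nbr[:-1, :] |= occupied[1:, :]
    nbr[:, 1:] |= occupied[:, :-1]
    nbr[:, :-1] |= occupied[:, 1:]
    frontier = nbr & ~occupied
    # colonisation probability scales with local habitat suitability
    occupied |= frontier & (rng.random(occupied.shape) < 0.5 * suitability)

print("cells colonised after 25 steps:", int(occupied.sum()))
```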

Keywords: cellular automata, fuzzy logic, landscape, spatiotemporal

Procedia PDF Downloads 60
14284 Optimal Placement and Sizing of Distributed Generation in Microgrid for Power Loss Reduction and Voltage Profile Improvement

Authors: Ferinar Moaidi, Mahdi Moaidi

Abstract:

Environmental issues and the ever-increasing demand for electrical energy make it necessary to have distributed generation (DG) resources in the power system. In this research, the allocation and sizing of DGs are used to realize the goals of reducing losses and improving the voltage profile in a microgrid. A Genetic Algorithm (GA), drawn from the array of artificial intelligence methods, is proposed for solving the problem, and the algorithm is implemented on the IEEE 33-bus network. The study is presented in two scenarios: the first illustrates the effect of DG location and sizing on loss reduction and voltage profile improvement. On the other hand, decisions made under a single-level load assumption are not valid for all load levels; therefore, load modelling is also performed, and results are presented for a multi-level load state.
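
A skeleton of the GA loop for this kind of problem is sketched below; the fitness function is a placeholder where the study would run a load flow on the IEEE 33-bus feeder and return real power losses (plus a voltage-deviation penalty). The population size, operators, and bounds are illustrative assumptions.

```python
import random
random.seed(0)

N_BUS, N_DG, POP = 33, 2, 40          # IEEE 33-bus feeder, two DG units

def fitness(ind):
    """Placeholder: run a load flow here and return losses + voltage penalty."""
    buses, sizes = ind[:N_DG], ind[N_DG:]
    return sum((b - 16) ** 2 for b in buses) + sum((s - 1.0) ** 2 for s in sizes)

def new_individual():
    return [random.randint(2, N_BUS) for _ in range(N_DG)] + \
           [random.uniform(0.1, 2.0) for _ in range(N_DG)]   # DG sizes in MW

pop = [new_individual() for _ in range(POP)]
for gen in range(100):
    pop.sort(key=fitness)
    parents = pop[: POP // 4]                    # elitist selection
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, 2 * N_DG)
        child = a[:cut] + b[cut:]                # one-point crossover
        if random.random() < 0.2:                # mutation
            i = random.randrange(2 * N_DG)
            child[i] = new_individual()[i]
        children.append(child)
    pop = parents + children

best = min(pop, key=fitness)
print("best [buses..., sizes...]:", best, "fitness:", fitness(best))
```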

Keywords: distributed generation, genetic algorithm, microgrid, load modelling, loss reduction, voltage improvement

Procedia PDF Downloads 130
14283 Peril's Environment of an Energetic Infrastructure Complex System: Modelling by Crisis Situation Algorithms

Authors: Jiří F. Urbánek, Alena Oulehlová, Hana Malachová, Jiří J. Urbánek Jr.

Abstract:

Crisis situations are investigated and modelled within a complex system of energetic critical infrastructure operating in a perils environment. Every crisis situation and peril originates in the occurrence of an emergency or crisis event and requires assessment of critical/crisis interfaces. An emergency event may be expected, in which case crisis scenarios can be pre-prepared by the pertinent crisis management authorities for coping with it, or unexpected, without a pre-prepared scenario; both cases nevertheless require operational coping by means of crisis management. The operation, forms, characteristics, behaviour, and use of crisis management vary in quality, depending on the actual perils facing the critical infrastructure organization and on prevention and training processes. The aim is always better security and continuity of the organization, which requires finding and investigating the critical/crisis zones and functions in models of the critical infrastructure organization operating in its perils environment. Our DYVELOP (Dynamic Vector Logistics of Processes) method is available for this purpose. It is necessary to derive and create an identification algorithm for critical/crisis interfaces, whose locations flag crisis situations in models of critical infrastructure organizations. The model of a crisis situation is then displayed for a real Czech energetic critical infrastructure organization in its real perils environment. Such measures are necessary for infrastructure protection; they are derived for peril mitigation, crisis situation coping, and the organization's environmentally friendly survival, continuity, and advanced possibilities for sustainable development.

Keywords: algorithms, energetic infrastructure complex system, modelling, perils environment

Procedia PDF Downloads 384
14282 Modelling of an Atomic Force Microscope Nanorobot's Friction Force on Rough Surfaces

Authors: M. Kharazmi, M. Zakeri, M. Packirisamy, J. Faraji

Abstract:

Micro/nanorobotics, or the manipulation of nanoparticles by the Atomic Force Microscope (AFM), is one of the most important approaches for controlling the movement of atoms, particles, and micro/nanometric components and assembling them to build micro/nanometre-scale tools. Accurate modelling of manipulation requires identification of forces and mechanical knowledge at the nanoscale, which differ from those of the macro world. Owing to the importance of adhesion forces and surface interactions at the nanoscale, several friction models have been proposed. In this research, the friction and normal forces applied to the AFM are obtained using a dynamic bending-torsion model of the AFM, based on the Hurtado-Kim (HK) friction model, the Johnson-Kendall-Roberts (JKR) contact model, and the Greenwood-Williamson (GW) roughness model. Finally, the effect of the standard deviation of asperity height on the normal load, friction force, and friction coefficient is studied.
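
A numerical sketch of the GW ingredient, assuming a Gaussian asperity-height distribution: the expected normal load at a given surface separation follows from integrating the Hertzian asperity load over the height distribution. All parameter values below are invented for illustration.

```python
import numpy as np
from scipy.integrate import quad

eta, A_n = 1e11, 1e-12        # asperity density (1/m^2), nominal area (m^2)
R, E_star = 20e-9, 50e9       # asperity tip radius (m), effective modulus (Pa)

def gw_load(d, sigma):
    """Greenwood-Williamson normal load at separation d, height std sigma."""
    phi = lambda s: np.exp(-s**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    integrand = lambda s: (s - d) ** 1.5 * phi(s)
    I, _ = quad(integrand, d, 10 * sigma)
    return eta * A_n * (4.0 / 3.0) * E_star * np.sqrt(R) * I

for sigma in (0.5e-9, 1e-9, 2e-9):   # effect of asperity-height std deviation
    print(sigma, gw_load(1e-9, sigma))
```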

Keywords: atomic force microscopy, contact model, friction coefficient, Greenwood-Williamson model

Procedia PDF Downloads 180
14281 CFD Study of Subcooled Boiling Flow at Elevated Pressure Using a Mechanistic Wall Heat Partitioning Model

Authors: Machimontorn Promtong, Sherman C. P. Cheung, Guan H. Yeoh, Sara Vahaji, Jiyuan Tu

Abstract:

The wide range of industrial applications involving boiling flows motivates the need to establish fundamental knowledge of boiling flow phenomena, and a number of experimental and numerical studies have been performed to elucidate the underlying physics of these flows. In this paper, improved wall boiling models, implemented in ANSYS CFX 14.5, are introduced to study subcooled boiling flow at elevated pressure. At the heated wall boundary, a fractal model, a force-balance approach, and a mechanistic frequency model are used to predict the nucleation site density, bubble departure diameter, and bubble departure frequency, respectively. The wall heat flux partitioning closures were modified to consider the influence of bubble sliding along the wall before lift-off, which usually happens in flow boiling. The simulation was performed with the two-fluid model, with the standard k-ω SST model selected for turbulence modelling. Existing experimental data at around 5 bar were chosen to evaluate the accuracy of the presented mechanistic approach. The void fraction and interfacial area concentration (IAC) are in good agreement with the experimental data; however, the bubble velocity and Sauter mean diameter (SMD) are over-predicted. This over-prediction may be caused by considering only dispersed, spherical bubbles in the simulations. In future work, important physical mechanisms of bubbles, such as merging and shrinking while sliding on the heated wall, will be incorporated into this mechanistic model to extend its capability to a wider range of flow predictions.
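
For reference, wall heat flux partitioning closures of this family (following the RPI model of Kurul and Podowski; the modified closure set used in this paper differs in its treatment of sliding bubbles) split the wall heat flux into convection, quenching, and evaporation components:

```latex
q_w = q_c + q_q + q_e, \qquad
q_e = N_a \, f \, \frac{\pi d_d^3}{6} \, \rho_v \, h_{fg}
```

where N_a is the nucleation site density, f the bubble departure frequency, d_d the departure diameter, ρ_v the vapour density, and h_fg the latent heat of vaporization.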

Keywords: subcooled boiling flow, computational fluid dynamics (CFD), mechanistic approach, two-fluid model

Procedia PDF Downloads 296
14280 Kirchhoff’s Depth Migration over Heterogeneous Velocity Models with Ray Tracing Modeling Approach

Authors: Alok Kumar Routa, Priya Ranjan Mohanty

Abstract:

Complex seismic signatures are generated by the complexity of the subsurface and are difficult to interpret. In the present study, an attempt has been made to model a complex subsurface using the ray tracing modelling technique. For the imaging of these geological features, Kirchhoff’s prestack depth migration is then applied to the synthetic common-shot-gather dataset. It is found that Kirchhoff’s migration technique, combined with the ray tracing modelling concept, is flexible enough to image various complex geologies, giving satisfactory results with proper delineation of the reflectors at their true depth positions. The entire work was carried out in the MATLAB environment.
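
As a minimal illustration of the migration principle (not the paper's implementation), the sketch below performs a constant-velocity diffraction-stack (Kirchhoff-type) migration of a zero-offset section, summing each image point's contribution along its diffraction hyperbola; the amplitude and obliquity weights of true Kirchhoff migration are omitted.

```python
import numpy as np

def kirchhoff_migrate(data, dt, dx, v):
    """data: (n_traces, n_samples) zero-offset section; v: constant velocity."""
    n_x, n_t = data.shape
    image = np.zeros((n_x, n_t))
    z = 0.5 * v * dt * np.arange(n_t)          # depth axis of the output image
    for ix in range(n_x):                      # lateral image position
        offs = (np.arange(n_x) - ix) * dx
        for iz in range(1, n_t):
            # two-way time along the diffraction hyperbola
            t = 2.0 * np.sqrt(offs**2 + z[iz]**2) / v
            it = np.rint(t / dt).astype(int)
            ok = it < n_t
            image[ix, iz] = data[ok, it[ok]].sum()
    return image

# tiny smoke test: one point diffractor at 400 m depth under trace 32
n_x, n_t, dt, dx, v = 64, 256, 0.004, 12.5, 2000.0
section = np.zeros((n_x, n_t))
for ix in range(n_x):
    t = 2.0 * np.sqrt(((ix - 32) * dx) ** 2 + 400.0 ** 2) / v
    section[ix, min(int(t / dt), n_t - 1)] = 1.0
img = kirchhoff_migrate(section, dt, dx, v)
print("energy focuses near trace:", img.argmax() // n_t)
```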

Keywords: Kirchhoff’s migration, prestack depth migration, ray tracing modelling, velocity model

Procedia PDF Downloads 340
14279 Crude Oil Electrostatic Mathematical Modelling on an Existing Industrial Plant

Authors: Fatemeh Yazdanmehr, Iulian Nistor

Abstract:

The scope of the current study is the prediction of water separation in a two-stage industrial crude oil desalting plant. The research focused on improving the desalting operation in an existing production unit of an Iranian heavy oil field with a capacity of 75 MBPD. Because of operational issues, such as oil dehydration at high temperatures, optimization of the desalter's operational parameters was essential. The desalting process is modelled mathematically using the population balance method, and existing operational data are used for tuning and validating the accuracy of the model. The inlet oil temperature to the desalter was decreased from 110 °C to 80 °C, and the desalter electric field was increased from 0.75 kV to 2.5 kV. The proposed desalter conditions also meet the water-in-oil specification. Under these conditions, oil recovery increases by 574 BBL/D and gas flaring decreases by 2.8 MMSCF/D. Depending on the oil price, the additional oil production can increase annual income by about $15 MM while reducing the greenhouse gas emissions caused by gas flaring.

Keywords: desalter, demulsification, modelling, water-oil separation, crude oil emulsion

Procedia PDF Downloads 49
14278 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies

Authors: Monica Lia

Abstract:

This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, visualization, and dynamic report building. Customer analysis for an economic organization is based on information from the organization's transactional systems. The paper shows how to develop the data model starting from the data that companies hold inside their own operational systems; this data can then be transformed into useful information about customers using a business intelligence tool. In a mature market, extracting the information inside the data and making forecasts for strategic decisions becomes ever more important, and business intelligence tools are used in business organizations as support for decision-making.

Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes

Procedia PDF Downloads 410
14277 Modelling of Pipe Jacked Twin Tunnels in a Very Soft Clay

Authors: Hojjat Mohammadi, Randall Divito, Gary J. E. Kramer

Abstract:

Tunnelling and pipe jacking in very soft soils (fat clays), even with an Earth Pressure Balance tunnel boring machine (EPBM), can cause large ground displacements. In this study, the short-term and long-term ground and tunnel response is predicted for twin, pipe-jacked EPBM tunnels of 3 m diameter with a narrow pillar width. Initial modelling indicated complete closure of the annulus gap at the tail shield onto the centrifugally cast, glass-fibre-reinforced polymer mortar jacking pipe (FRP). Numerical modelling was employed to simulate the excavation and support installation sequence, examine the ground response during excavation, confirm the adequacy of the pillar width, and check the structural adequacy of the installed pipe. In the numerical models, a Mohr-Coulomb constitutive model including the effect of unloading was adopted for the fat clays, while the generalized Hoek-Brown criterion was employed for the bedrock layer. The numerical models considered explicit excavation sequences and different levels of ground convergence prior to support installation. These well-studied excavation sequences made the analysis of this very soft clay possible; otherwise, obtaining convergence in the numerical analysis would have been impossible. The predicted results indicate that the ground displacements around the tunnels and their effect on the pipe would be acceptable, despite predictions of large zones of plastic behaviour around the tunnels and within the entire pillar between them due to excavation-induced ground movements.

Keywords: finite element modeling (FEM), pipe-jacked tunneling, very soft clay, EPBM

Procedia PDF Downloads 65
14276 Edmonton Urban Growth Model as a Support Tool for the City Plan Growth Scenarios Development

Authors: Sinisa J. Vukicevic

Abstract:

Edmonton is currently one of the youngest North American cities and has achieved significant growth over the past 40 years. This strong urban shift requires a new approach to how the city is envisioned, planned, and built: evidence-based scenario development, in which an urban growth model was a key support tool for framing Edmonton's development strategies, developing urban policies, and assessing policy implications. The urban growth model was developed using the Metronamica software platform. The Metronamica land use model evaluates the dynamics of land use change under the influence of key development drivers (population and employment), zoning, land suitability, and land and activity accessibility. The model was designed following the Big City Moves ideas: become greener as we grow, develop a rebuildable city, ignite a community of communities, foster a healing city, and create a city of convergence. The Big City Moves were converted into three development scenarios: ‘Strong Central City’, ‘Node City’, and ‘Corridor City’. Each scenario has a narrative story expressing its high-level goal, its approach to residential and commercial activities, its transportation vision, and its employment and environmental principles. Land use demand was calculated for each scenario according to specific density targets. Spatial policies were analyzed according to their level of importance within the policy set defined for each scenario, as well as through the policy measures. The model was calibrated to reproduce the known historical land use pattern, using 2006 and 2011 land use data, and validated independently with 2016 data not used for calibration. In general, the modelling process contained three main phases: ‘from qualitative storyline to quantitative modelling’, ‘model development and model run’, and ‘from quantitative modelling to qualitative storyline’. The model also incorporates five spatial indicators: distance from residential to work, distance from residential to recreation, distance to the river valley, urban expansion, and habitat fragmentation. The major findings of this research can be viewed from two perspectives. The planning perspective evaluates the model as a tool for scenario development: using the model, we explored the land use dynamics influenced by different sets of policies, which enabled a direct comparison of the three scenarios and of the similarities and differences in their quantitative indicators: land use change, population change (and spatial allocation), job allocation, density (population, employment, and dwelling units), habitat connectivity, proximity to objects of interest, and so on. From the technology perspective, the model showed one very important characteristic: flexibility. The direction of policy testing changed many times during the consultation process, and the model's flexibility in accommodating all these changes was highly appreciated. The model satisfied our needs both as a scenario development and evaluation tool and as a communication tool during the consultation process.

Keywords: urban growth model, scenario development, spatial indicators, Metronamica

Procedia PDF Downloads 76
14275 Design and Optimization of a Customized External Fixation Device for Lower Limb Injuries

Authors: Mohammed S. Alqahtani, Paulo J. Bartolo

Abstract:

External fixation is a common technique for the treatment and stabilization of bone fractures. Different designs have been proposed by companies and research groups, but all of them present limitations such as high weight, discomfort in use, and lack of customization to individual patients. This paper proposes a lightweight, customized external fixator that overcomes some of these limitations. The external fixator is designed using a set of techniques comprising medical imaging, CAD modelling, finite element analysis, and a full factorial design of experiments. Key design parameters are discussed, and the optimal set of parameters is used to design the final external fixator. Numerical simulations are used to validate the design concepts. Results show an optimal external fixator design with a weight reduction of 13% without compromising stiffness or structural integrity. The external fixator is also designed to be additively manufactured, allowing a strategy for personalization to be developed.
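
A sketch of the full factorial design step: every combination of factor levels becomes one finite element run whose responses (stiffness, mass) are then compared. The factors and levels below are hypothetical placeholders; the study's actual parameters are not listed in the abstract.

```python
from itertools import product

# hypothetical design factors and levels for the fixator frame
factors = {
    "strut_diameter_mm": [8, 10, 12],
    "ring_thickness_mm": [4, 6],
    "material": ["PA12", "PEEK"],
}

# full factorial design: 3 * 2 * 2 = 12 runs, one FE simulation each
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(i, run)        # hand each configuration to the FEA stage
```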

Keywords: computer-aided design modelling, external fixation, finite element analysis, full factorial, personalization

Procedia PDF Downloads 140
14274 A Review of Transformer Modeling for Power Line Communication Applications

Authors: Balarabe Nkom, Adam P. R. Taylor, Craig Baguley

Abstract:

Power Line Communications (PLC) is being employed in existing power systems despite the infrastructure not being designed with PLC considerations in mind. Given that power transformers can last for decades, the distribution transformer in particular exists as a relic of un-optimized technology. To determine issues that may need to be addressed in subsequent designs of such transformers, it is essential to have a highly accurate transformer model for simulation and subsequent optimization for the PLC environment, with a view to increasing data speed, throughput, and efficiency while improving overall system stability and reliability. This paper reviews the methods currently available for creating transformer models and provides insights into the requirements of each for obtaining high accuracy. The review indicates that combining traditional analytical methods in a hybrid approach gives good accuracy at reasonable cost.

Keywords: distribution transformer, modelling, optimization, power line communications

Procedia PDF Downloads 489
14273 Application of Building Information Modelling in Analysing IGBC® Ratings (Sustainability Analyses)

Authors: Lokesh Harshe

Abstract:

The building construction sector accounts for 36% of global energy consumption and 39% of CO₂ emissions. Professionals in the built environment sector have long been aware of the industry’s contribution to CO₂ emissions and are now moving towards more sustainable practices. As a result, many organizations have introduced rating systems that address global warming in the construction sector by ranking construction projects on sustainability parameters. The pre-construction phase of any building project is the most critical time for making decisions that address sustainability. Traditionally, it is very difficult in the pre-construction phase to collect data from different stakeholders and bring it together to support decisions based on factual data. Building Information Modelling (BIM) offers a solution: a single model results from the collaborative BIM process, in which all information is shared, extracted, communicated, and stored on one platform that everyone can access, so decisions can be based on real-time data. The focus of this research is the Indian green rating system IGBC®, with the objective of understanding IGBC® requirements and developing a framework that relates the rating processes to BIM. A hypothetical architectural model of a hostel building is developed using AutoCAD 2019 and Revit Architecture 2019, and the framework is applied to generate sustainability analysis results using Green Building Studio (GBS) and Revit add-ins. The results of a sustainability analysis are generated within a fraction of a minute, which is very quick compared with traditional sustainability analysis and may save considerable time and cost. The future scope is to integrate architectural, structural, and MEP models to perform accurate sustainability analyses with inputs from industry professionals working on real-life green BIM projects.

Keywords: sustainability analyses, BIM, green rating systems, IGBC®, LEED

Procedia PDF Downloads 29
14272 Modelling and Simulation of Bioethanol Production from Food Waste Using CHEMCAD Software

Authors: Kgomotso Matobole, Noluzuko Monakali, Hilary Rutto, Tumisang Seodigeng

Abstract:

On a global scale, food waste is generated at an alarming rate, across the entire food supply chain. Worldwide urbanization, as well as global economic growth, has contributed to the amount of food waste the environment receives. When not properly disposed of, food waste ends up in illegal dumping sites or landfills, resulting in environmental pollution due to inadequate waste management practices. Food waste is rich in organic matter and highly biodegradable; hence, it can be utilized for the production of bioethanol, a type of biofuel. In so doing, alternative energy is created and the volume of food waste is reduced, so that food waste comes to be seen as a precious commodity for energy generation instead of a pollutant. The main aim of the project was to simulate a biorefinery using the CHEMCAD 7.12 software. The resulting ethanol purity from the simulation was 98.9%, with a food waste to water feed ratio of 1:2. This was achieved by integrating the necessary unit operations and optimising their operating conditions.
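
For orientation, the theoretical yield ceiling behind any fermentation flowsheet follows from glucose stoichiometry, C₆H₁₂O₆ → 2 C₂H₅OH + 2 CO₂, independent of the CHEMCAD simulation itself; the 90% efficiency below is an assumed figure for illustration.

```python
# theoretical ethanol yield from glucose fermentation
M_glucose = 180.16    # g/mol
M_ethanol = 46.07     # g/mol
yield_g_per_g = 2 * M_ethanol / M_glucose
print(f"{yield_g_per_g:.3f} g ethanol per g glucose")   # ~0.511

# e.g. 1000 kg/day of fermentable sugars at an assumed 90% of theory
sugars_kg = 1000.0
ethanol_kg = sugars_kg * yield_g_per_g * 0.90
print(f"{ethanol_kg:.0f} kg/day ethanol before distillation")
```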

Keywords: fermentation, bioethanol, food waste, hydrolysis, simulation, modelling

Procedia PDF Downloads 319