Search results for: beyond the standard model

20014 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis

Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior

Abstract:

Mathematical models of drying are used to understand the drying process and to determine important parameters for the design and operation of the dryer. Jackfruit is a highly perishable fruit with high consumption in the Northeast, and techniques to improve its conservation are needed so that it can be distributed to regions with low consumption. This study analysed several mathematical models (Page, Lewis, and Midilli) to indicate the one that best fits the conditions of the convective drying process, using performance indicators associated with each model: accuracy factor (Af), bias factor (Bf), root mean square error (RMSE), and standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. The Midilli model was the most accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34. However, the Midilli model is not appropriate for process control purposes because it requires four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model.
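
A minimal sketch (not the authors' code) of fitting the Page thin-layer drying model and computing the performance indicators named above. The moisture-ratio data are purely illustrative, and Af, Bf, and %SEP follow common literature definitions, which may differ from the authors' exact formulas.

```python
import numpy as np
from scipy.optimize import curve_fit

def page(t, k, n):
    # Page model: MR = exp(-k * t**n)
    return np.exp(-k * t**n)

# Hypothetical drying data: time in hours, moisture ratio MR (dimensionless)
t = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
mr = np.array([1.00, 0.82, 0.66, 0.53, 0.42, 0.33, 0.26, 0.20, 0.16, 0.12])

(k, n), _ = curve_fit(page, t, mr, p0=[0.1, 1.0])
pred = page(t, k, n)

rmse = np.sqrt(np.mean((pred - mr) ** 2))
sep = 100.0 * rmse / mr.mean()          # standard error of prediction, %
ratio = np.log10(pred / mr)
af = 10 ** np.mean(np.abs(ratio))       # accuracy factor
bf = 10 ** np.mean(ratio)               # bias factor
print(f"Page fit: k={k:.4f}, n={n:.3f}  RMSE={rmse:.4f}  %SEP={sep:.2f}  Af={af:.3f}  Bf={bf:.3f}")
```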

Keywords: drying, models, jackfruit, biotechnology

Procedia PDF Downloads 347
20013 Overview of Standard Unit System of Shenzhen Land Spatial Planning and Case Analysis

Authors: Ziwei Huang

Abstract:

The standard unit of Shenzhen land spatial planning has the characteristics of vertical conduction, horizontal evaluation, internal balance, and supervision of implementation. It mainly assumes the role of a geospatial unit, assists in promoting the complex development of business in Shenzhen, and undertakes the management and transmission between upper and lower levels of planning, as well as urban management functions such as gap analysis of public facilities, planning evaluation, and dynamic monitoring of planning information. The application example of the gap analysis of public facilities in Longgang District shows that the standard unit of land spatial planning in Shenzhen, as a small-scale basic geographic unit, has a strong urban spatial coupling effect. However, the system still lacks universal applicability, and more scientific and robust standards for delineating standard units and indicators for evaluating planning functions are needed to guide its popularization and application.

Keywords: Shenzhen city, land spatial planning, standard unit system, urban delicacy management

Procedia PDF Downloads 93
20012 Hybrid Equity Warrants Pricing Formulation under Stochastic Dynamics

Authors: Teh Raihana Nazirah Roslan, Siti Zulaiha Ibrahim, Sharmila Karim

Abstract:

A warrant is a financial contract that confers the right, but not the obligation, to buy or sell a security at a certain price before expiration. The standard procedure of valuing equity warrants with call option pricing models such as the Black-Scholes model has been shown to contain many flaws, such as the assumptions of a constant interest rate and constant volatility. In fact, existing alternative models were found to focus more on demonstrating pricing techniques than on empirical testing. A mathematical model for pricing and analyzing equity warrants that comprises stochastic interest rates and stochastic volatility is therefore essential to incorporate the dynamic relationships between the identified variables and to reflect the real market. Here, the aim is to develop dynamic pricing formulations for hybrid equity warrants by incorporating stochastic interest rates from the Cox-Ingersoll-Ross (CIR) model along with stochastic volatility from the Heston model. The development of the model involves the derivation of the stochastic differential equations that govern the model dynamics. The resulting equations, which involve a Cauchy problem and heat equations, are then solved using partial differential equation approaches. The analytical pricing formulas obtained in this study comply with the form of the analytical expressions embedded in the Black-Scholes model and other existing pricing models for equity warrants, which facilitates the use of the proposed formula for comparison purposes and further empirical study.
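
A minimal Monte Carlo sketch (not the authors' analytical formula) of the hybrid dynamics named above: Heston stochastic volatility combined with a CIR stochastic short rate. All parameters and the plain call-style payoff are illustrative assumptions; warrant dilution effects are ignored here.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, v0, r0, K, T = 100.0, 0.04, 0.03, 100.0, 1.0
kappa_v, theta_v, sigma_v = 2.0, 0.04, 0.3      # Heston variance parameters
kappa_r, theta_r, sigma_r = 1.5, 0.03, 0.1      # CIR short-rate parameters
rho = -0.5                                      # corr(stock, variance) shocks
n_steps, n_paths = 252, 50_000
dt = T / n_steps

S = np.full(n_paths, S0)
v = np.full(n_paths, v0)
r = np.full(n_paths, r0)
disc = np.zeros(n_paths)                        # integral of r dt for discounting

for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
    z3 = rng.standard_normal(n_paths)
    vp = np.maximum(v, 0.0)
    rp = np.maximum(r, 0.0)
    S *= np.exp((r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
    v += kappa_v * (theta_v - vp) * dt + sigma_v * np.sqrt(vp * dt) * z2
    r += kappa_r * (theta_r - rp) * dt + sigma_r * np.sqrt(rp * dt) * z3
    disc += rp * dt

price = np.mean(np.exp(-disc) * np.maximum(S - K, 0.0))
print(f"Monte Carlo hybrid warrant (call-style) value: {price:.3f}")
```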

Keywords: Cox-Ingersoll-Ross model, equity warrants, Heston model, hybrid models, stochastic

Procedia PDF Downloads 84
20011 3D CFD Modelling of the Airflow and Heat Transfer in Cold Room Filled with Dates

Authors: Zina Ghiloufi, Tahar Khir

Abstract:

A transient three-dimensional computational fluid dynamics (CFD) model is developed to determine the velocity and temperature distributions at different positions in a cold room during pre-cooling of dates. The turbulence model used is the k-ω Shear Stress Transport (SST) model with the standard wall function. The numerical results show that the cooling rate is not uniform inside the room; the product in the middle of the room has a slower cooling rate. This cooling heterogeneity has a large effect on the energy consumption during cold storage.

Keywords: CFD, cold room, cooling rate, dates, numerical simulation, k-ω (SST)

Procedia PDF Downloads 202
20010 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millennium

Authors: Janne Engblom, Elias Oikarinen

Abstract:

The understanding of housing price dynamics is of importance to a great number of agents: portfolio investors, banks, real estate brokers, and construction companies, as well as policy makers and households. A panel dataset follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication of a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates, and several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy; in the presence of cross-sectional dependence, standard OLS gives biased estimates. U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing prices as the dependent variable and the first differences of per capita income, interest rate, housing stock, and lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to provide estimates and to compare them between 1980-1999 and 2000-2012. Based on data for 50 U.S. cities over 1980-2012, differences in the short-run housing price dynamics estimates were mostly significant when the two time periods were compared. Significance tests of the differences were provided by the model containing interaction terms of the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach, which indicates a good fit of the CCE estimator model. The estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that the dynamics of the housing market are evolving over time.
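
A minimal sketch of the Common Correlated Effects mean-group (CCEMG) idea used above: each unit's regression is augmented with cross-sectional averages of the dependent and independent variables to absorb common factors, and the unit-specific OLS slopes are then averaged. The panel below is synthetic; the actual study uses housing-price data for 50 U.S. cities.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, k = 50, 33, 3                       # units, periods, regressors
f = rng.standard_normal(T)                # unobserved common factor
beta_true = np.array([0.5, -0.2, 0.3])

X = rng.standard_normal((N, T, k)) + 0.7 * f[None, :, None]
y = X @ beta_true + 1.0 * f[None, :] + 0.3 * rng.standard_normal((N, T))

x_bar = X.mean(axis=0)                    # (T, k) cross-sectional averages
y_bar = y.mean(axis=0)                    # (T,)

betas = np.empty((N, k))
for i in range(N):
    # Regressors: own X, a constant, and the cross-sectional averages
    Z = np.column_stack([X[i], np.ones(T), y_bar, x_bar])
    coef, *_ = np.linalg.lstsq(Z, y[i], rcond=None)
    betas[i] = coef[:k]

print("CCEMG slope estimates:", betas.mean(axis=0).round(3))
```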

Keywords: dynamic model, panel data, cross-sectional dependence, interaction model

Procedia PDF Downloads 228
20009 Islamic Financial Instrument, Standard Parallel Salam as an Alternative to Conventional Derivatives

Authors: Alireza Naserpoor

Abstract:

Derivatives are among the most important innovations of the past decades. In financial markets, they have changed the whole way of operation of the stock, commodity, and currency markets. Besides many advantages, conventional derivatives contracts also have disadvantages: they have been blamed for raising volatility, increasing bankruptcies, and causing financial crises. The Standard Parallel Salam (SPS) contract, as an Islamic financial product, is meanwhile a financing instrument that can be used for risk management by investors. Standard Parallel Salam is a Shari'ah-compliant contract and an alternative to conventional derivatives. Although unstructured forms of it have been used in several Islamic countries, this contract was introduced as a structured and standard financial instrument on the Iran Mercantile Exchange in 2014. In this paper, after introducing parallel Salam, we examine a collection of international experiences and local measures regarding the launch of the standard parallel Salam contract, describe standard scenarios for trading this instrument, including the two main approaches to using SPS, and report the practical experience of the Iran Mercantile Exchange with this instrument. Afterwards, we make a comparison between SPS and futures contracts as conventional derivatives.

Keywords: futures contracts, hedging, shari’ah compliant instruments, standard parallel salam

Procedia PDF Downloads 348
20008 A Simple Finite Element Method for Glioma Tumor Growth Model with Density Dependent Diffusion

Authors: Shangerganesh Lingeshwaran

Abstract:

In this presentation, we report numerical simulations of a reaction-diffusion equation with various nonlinear density-dependent diffusion operators and proliferation functions. The mathematical model, represented by a parabolic partial differential equation, is considered to study the invasion of gliomas (the most common type of brain tumor) and to describe the growth of cancer cells and their response to treatment. The unknown quantity of the reaction-diffusion equation is the density of cancer cells, and the mathematical model is based on the proliferation and migration of glioma cells. A standard Galerkin finite element method is used to perform the numerical simulations of the model. Finally, important observations on each of the nonlinear diffusion functions and proliferation functions are presented with the help of the computational results.
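
A simple 1D finite-difference sketch of a glioma-type reaction-diffusion model with density-dependent diffusion and logistic proliferation. It only illustrates the governing equation; the paper itself uses a standard Galerkin finite element method, and all coefficients below are assumed values.

```python
import numpy as np

L, nx, dt, t_end = 10.0, 201, 1e-4, 0.5    # domain length, grid points, time step, end time
D0, rho = 0.05, 3.0                        # diffusion scale, proliferation rate
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]

u = np.exp(-((x - L / 2) ** 2) / 0.1)      # initial tumor cell density

def D(u):
    # Density-dependent diffusion, e.g. D(u) = D0 * u (one of several possible choices)
    return D0 * u

t = 0.0
while t < t_end:
    Dm = D(0.5 * (u[1:] + u[:-1]))                     # diffusivity at cell faces
    flux = Dm * (u[1:] - u[:-1]) / dx                  # D * du/dx at faces
    u[1:-1] += dt * ((flux[1:] - flux[:-1]) / dx + rho * u[1:-1] * (1 - u[1:-1]))
    u[0], u[-1] = u[1], u[-2]                          # zero-gradient boundaries
    t += dt

print("max density:", u.max().round(3), " tumor 'mass':", (u.sum() * dx).round(3))
```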

Keywords: glioma invasion, nonlinear diffusion, reaction-diffusion, finite element method

Procedia PDF Downloads 199
20007 Intrapreneurship Discovery: Standard Strategy to Boost Innovation inside Companies

Authors: Chiara Mansanta, Daniela Sani

Abstract:

This paper studies the concept of intrapreneurship discovery for innovation and technology development in the manufacturing industries of the Marche Region, in central Italy. The study underlines the key drivers of the innovation process and the main factors that influence innovation. Starting from a literature study on open innovation, the paper examines the role of human capital in supporting company development. The empirical part of the study is based on a survey of 151 manufacturing companies, which represent 34% of that universe at the regional level. The survey highlights the main KPIs that influence companies in their decision processes; tools for these decision processes are then presented.

Keywords: business model, decision making, intrapreneurship discovery, standard methodology

Procedia PDF Downloads 148
20006 Development of Standard Thai Appetizer in Rattanakosin Era's Standard: Case Study of Thai Steamed Dumpling

Authors: Nunyong Fuengkajornfung, Pattama Hirunyophat, Tidarat Sanphom

Abstract:

The objectives of this research were to study the standard recipe of Thai steamed dumpling, to study the ratio of modified starch in Thai steamed dumpling, and to analyse the chemical composition and Escherichia coli content of Thai steamed dumpling. The experimental process was designed in two stages: studying the standard recipe of Thai steamed dumpling, and studying the ratio of rice flour to modified starch at three levels, 90:10, 73:30, and 50:50. Sensory evaluation of color, smell, taste, texture, and overall liking used a 9-point hedonic scale, in an experiment arranged as a randomized complete block design (RCBD). The statistics used in the data analyses were means, standard deviations, one-way ANOVA, Duncan's new multiple range test, and regression equations, at a statistical significance level of .05. The standard recipe was selected from three candidate recipes by sensory evaluation of color, odor, taste, spiciness, texture, and overall acceptance; the second recipe was found to be the most suitable for development. Among the three rice flour to modified starch ratios (90:10, 73:30, and 50:50), the 50:50 condition received the best scores (like moderately to like very much on the 9-point hedonic scale). Chemical composition analysis showed 58.63% moisture, 5.45% fat, 4.35% protein, 30.45% carbohydrate, and 1.12% ash, and Escherichia coli was not detected in laboratory testing.

Keywords: Thai snack in Rattanakosin era, Thai steamed dumpling, modified starch, recipe standard

Procedia PDF Downloads 297
20005 Modeling of Turbulent Flow for Two-Dimensional Backward-Facing Step Flow

Authors: Alex Fedoseyev

Abstract:

This study investigates a simplified generalized hydrodynamic equation (GHE) model for the simulation of turbulent flow over a two-dimensional backward-facing step (BFS) at Reynolds number Re=132000. The GHE was derived from the generalized Boltzmann equation (GBE), which was obtained from first principles from the chain of Bogolubov kinetic equations and considers particles of finite dimensions. The GHE has additional terms, temporal and spatial fluctuations, compared to the Navier-Stokes equations (NSE). These terms have a timescale multiplier τ, and the GHE becomes the NSE when τ is zero. The nondimensional τ is the product of the Reynolds number and the squared length-scale ratio, τ = Re·(l/L)², where l is the apparent Kolmogorov length scale and L is a hydrodynamic length scale. BFS flow modeling results obtained by 2D calculations cannot match the experimental data for Re>450 unless one or two additional equations are added to the NSE as a turbulence model, which typically has two to five parameters to be tuned for specific problems. It is shown that the GHE does not require an additional turbulence model, and its turbulent velocity results are in good agreement with the experimental results. A review of several studies on the simulation of flow over the BFS from 1980 to 2023 is provided; most of these studies used different turbulence models when Re>1000. In this study, the 2D turbulent flow over a BFS with step height H=L/3 (where L is the channel height) at Reynolds number Re=132000 was investigated using numerical solutions of the GHE (by a finite-element method) and compared to solutions of the Navier-Stokes equations, the k-ε turbulence model, and experimental results. The comparison included the velocity profiles at X/L=5.33 (near the end of the recirculation zone, available from the experiment), the recirculation zone length, and the velocity flow field. The mean velocity of the NSE was obtained by averaging the solution over the number of time steps. The solution with a standard k-ε model shows a velocity profile at X/L=5.33 that has no backward flow, and the standard k-ε model underpredicts the experimental recirculation zone length X/L = 7.0 ± 0.5 by a substantial 20-25%, so a more sophisticated turbulence model is needed for this problem. The obtained data confirm that the GHE results are in good agreement with the experimental results for turbulent flow over a two-dimensional BFS; a turbulence model was not required in this case, and the computations were stable. The solution time for the GHE is the same as or less than that for the NSE, and significantly less than that for the NSE with a turbulence model. The proposed approach was limited to 2D and a single Reynolds number; further work will extend it to 3D flow and higher Re.
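
A small worked example of the GHE timescale multiplier defined above, τ = Re·(l/L)². The Reynolds number comes from the study; the length-scale ratio l/L below is an assumed illustrative value, not one reported by the author.

```python
Re = 132_000
l_over_L = 1.0e-3          # assumed apparent-Kolmogorov-to-hydrodynamic length-scale ratio
tau = Re * l_over_L**2
print(f"tau = {tau:.4f}  (tau -> 0 recovers the Navier-Stokes equations)")
```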

Keywords: backward-facing step, comparison with experimental data, generalized hydrodynamic equations, separation, reattachment, turbulent flow

Procedia PDF Downloads 22
20004 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through this simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on several covariates: policemen, daily patrols, speed cameras, traffic lights, and roundabouts. The CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius, and the forecasting equations provide reliable one-step-ahead forecasts.
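
A minimal simulation sketch of a bivariate INARMA(1,1)-type process with correlated Poisson innovations, in the spirit of the BINARMA(1,1) model described above. The binomial-thinning recursion and the common-shock construction of the correlated innovations are standard devices, but the exact parameterization of the paper may differ, and all parameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
a = np.array([0.4, 0.3])          # AR thinning parameters for the two series
b = np.array([0.2, 0.25])         # MA thinning parameters
lam = np.array([1.5, 2.0])        # series-specific innovation means
lam_c = 0.8                       # common-shock mean inducing cross-correlation

def thin(x, alpha):
    # Binomial thinning: alpha o x ~ Binomial(x, alpha)
    return rng.binomial(x, alpha)

X = np.zeros((T, 2), dtype=int)
R_prev = np.zeros(2, dtype=int)
for t in range(1, T):
    common = rng.poisson(lam_c)                       # shared shock
    R = rng.poisson(lam) + common                     # correlated Poisson innovations
    X[t] = thin(X[t - 1], a) + R + thin(R_prev, b)
    R_prev = R

print("sample means:", X.mean(axis=0).round(2),
      "cross-correlation:", np.corrcoef(X[:, 0], X[:, 1])[0, 1].round(3))
```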

Keywords: non-stationary, BINARMA(1, 1) model, Poisson innovations, conditional maximum likelihood, CML

Procedia PDF Downloads 100
20003 Performance Gap and near Zero Energy Buildings Compliance of Monitored Passivhaus in Northern Ireland, the Republic of Ireland and Italy

Authors: S. Colclough, V. Costanzo, K. Fabbri, S. Piraccini, P. Griffiths

Abstract:

The near Zero Energy Building (nZEB) standard is required for all buildings from 2020. The Passive House (PH) standard is a well-established low-energy building standard, designed over 25 years ago, and could potentially be used to achieve the nZEB standard in combination with renewables. By comparing measured performance with design predictions, this paper considers whether there is a performance gap for a number of monitored properties and assesses whether the nZEB standard can be achieved by following the well-established PH scheme. The analysis is based on monitoring results from real buildings located in Northern Ireland, the Republic of Ireland, and Italy, with particular focus on indoor air quality, including the assumed and measured indoor temperatures and heating periods for both standards, as recorded during a full annual cycle. An analysis is also carried out on the energy performance certificates of each of the dwellings to determine whether they meet the near Zero Energy Building primary energy consumption targets set in the respective jurisdictions. Each of the dwellings is certified as complying with the Passive House standard and accordingly has very good insulation levels, heat recovery and ventilation systems with efficiencies greater than 75%, and an airtightness of less than 0.6 air changes per hour at 50 Pa. It is found that indoor temperature and relative humidity were within the comfort boundaries set at the design stage, while carbon dioxide concentrations were sometimes higher than the values suggested by the EN 15251 standard for comfort class I, especially in bedrooms.

Keywords: monitoring campaign, nZEB (near zero energy buildings), Passivhaus, performance gap

Procedia PDF Downloads 124
20002 Elastoplastic and Ductile Damage Model Calibration of Steels for Bolt-Sphere Joints Used in China’s Space Structure Construction

Authors: Huijuan Liu, Fukun Li, Hao Yuan

Abstract:

The bolted spherical node is a common type of joint in space steel structures, and the bolt-sphere joint portion almost always controls the bearing capacity of the bolted spherical node. Investigating the bearing performance and progressive failure in service often requires high-fidelity numerical models. This paper focuses on the constitutive models of the bolt steel and sphere steel used in China's space structure construction. The elastoplastic model is determined from a standard tensile test and calibrated with the Voce saturation hardening rule. Fractography analysis shows that ductile damage is dominant, so the Rice-Tracey ductile fracture rule is selected and its model parameters are calibrated from tensile tests of notched specimens. These calibrated material models can benefit research and engineering work in similar fields.
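
A minimal sketch of calibrating a Voce-type saturation hardening law, sigma = sigma0 + Q·(1 − exp(−b·eps_p)), from tensile-test data. The stress-strain points below are synthetic placeholders, not the bolt or sphere steel data used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def voce(eps_p, sigma0, Q, b):
    # Voce saturation hardening: flow stress as a function of plastic strain
    return sigma0 + Q * (1.0 - np.exp(-b * eps_p))

eps_p = np.linspace(0.0, 0.10, 11)                       # plastic strain
sigma = voce(eps_p, 640.0, 180.0, 35.0) + np.random.default_rng(3).normal(0, 3, eps_p.size)

(p_sigma0, p_Q, p_b), _ = curve_fit(voce, eps_p, sigma, p0=[500.0, 100.0, 10.0])
print(f"Calibrated Voce parameters: sigma0={p_sigma0:.1f} MPa, Q={p_Q:.1f} MPa, b={p_b:.1f}")
```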

Keywords: bolt-sphere joint, steel, constitutive model, ductile damage, model calibration

Procedia PDF Downloads 109
20001 Predicting the Exposure Level of Airborne Contaminants in Occupational Settings via the Well-Mixed Room Model

Authors: Alireza Fallahfard, Ludwig Vinches, Stephane Halle

Abstract:

In the workplace, the exposure level of airborne contaminants should be evaluated due to health and safety issues. This can be done with numerical models or experimental measurements, but the numerical approach is useful when it is challenging to perform experiments. One of the simplest models is the well-mixed room (WMR) model, which has shown its usefulness for predicting inhalation exposure in many situations. However, since the WMR is limited to gases and vapors, it cannot be used to predict exposure to aerosols. The main objective is to modify the WMR model to expand its application to exposure scenarios involving aerosols. To reach this objective, the standard WMR model has been modified to consider the deposition of particles by gravitational settling and by Brownian and turbulent deposition. Three deposition models were implemented in the model. The time-dependent concentrations of airborne particles predicted by the model were compared to experimental results obtained in a 0.512 m³ chamber. Polystyrene particles of 1, 2, and 3 µm in aerodynamic diameter were generated with a nebulizer at two air change rates (ACH). The well-mixed condition and the chamber ACH were determined by the tracer gas decay method. The mean friction velocity on the chamber surfaces, one of the input variables for the deposition models, was determined by computational fluid dynamics (CFD) simulation. For the experimental procedure, the particles were generated until reaching the steady-state condition (emission period); then generation stopped, and concentration measurements continued until reaching the background concentration (decay period). The tracer gas decay tests revealed that the ACHs of the chamber were 1.4 and 3.0 and that the well-mixed condition was achieved. The CFD results showed that the average mean friction velocities (and their standard deviations) for the lowest and highest ACH were (8.87 ± 0.36)×10⁻² m/s and (8.88 ± 0.38)×10⁻² m/s, respectively. The numerical results indicated that the difference between the deposition rates predicted by the three deposition models was less than 2%. The experimental and numerical aerosol concentrations were compared for the emission period and the decay period. In both periods, the prediction accuracy of the modified model improved in comparison with the classic WMR model; however, there is still a difference between the actual and predicted values. In the emission period, the modified WMR results closely follow the experimental data, but the model significantly overestimates the experimental results during the decay period. This finding is mainly due to an underestimation of the deposition rate in the model and to uncertainty related to the measurement devices and the particle size distribution. Comparing the experimental and numerical deposition rates revealed that the actual particle deposition rate is significant, but the deposition rate given by the mechanisms considered in the model was ten times lower than the experimental value. Thus, particle deposition is significant, will affect the airborne concentration in occupational settings, and should be considered in airborne exposure prediction models. The role of other removal mechanisms should be investigated.
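
A minimal sketch of the modified well-mixed room balance described above: a single zone with ventilation removal plus a first-order particle deposition loss, dC/dt = G/V − (λ + β)·C, where λ is the air change rate and β the deposition loss rate. The chamber volume and ACH values come from the abstract; the deposition velocity, surface area, and emission rate are assumed illustrative numbers.

```python
V = 0.512                    # chamber volume, m^3 (as in the abstract)
ach = 3.0                    # air changes per hour (higher of the two tested rates)
lam = ach / 3600.0           # ventilation loss rate, 1/s
v_d = 3.0e-5                 # assumed effective deposition velocity, m/s
A = 3.84                     # assumed total deposition surface area, m^2
beta = v_d * A / V           # deposition loss rate, 1/s
G = 1.0e4                    # assumed particle emission rate, particles/s

dt, t_emit, t_total = 1.0, 3600.0, 7200.0
C = 0.0
history = []
for step in range(int(t_total / dt)):
    gen = G if step * dt < t_emit else 0.0           # emission period, then decay period
    C += dt * (gen / V - (lam + beta) * C)           # explicit Euler update of the balance
    history.append(C)

print(f"steady-state estimate: {G / (V * (lam + beta)):.3e} particles/m^3, "
      f"end of decay: {history[-1]:.3e} particles/m^3")
```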

Keywords: aerosol, CFD, exposure assessment, occupational settings, well-mixed room model, zonal model

Procedia PDF Downloads 73
20000 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network

Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan

Abstract:

The most aggressive form of brain tumor is called glioma. Glioma is a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model, the BRATS 2015 MRI dataset to evaluate the proposed model, and the SimpleITK open-source library to analyze the images. Moreover, we extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation; extracting 2D patches instead of 3D ones involves less dimensional information, which helps reduce the computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for the complete, 0.76 for the core, and 0.77 for the enhanced tumor regions. These results are comparable with methods based on previously implemented 2D CNN architectures.
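
A minimal 2D-CNN patch classifier sketch in the spirit of the model described above. It uses tf.keras (rather than the Keras/Theano setup of the paper), a guessed patch size of 33×33 with 4 MRI channels, and 5 output classes; these architectural details are assumptions, not the authors' exact configuration.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

patch_size, n_channels, n_classes = 33, 4, 5

model = keras.Sequential([
    layers.Input(shape=(patch_size, patch_size, n_channels)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(n_classes, activation="softmax"),    # label of the patch centre voxel
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in patches; in practice these would be 2D patches extracted from the
# BRATS 2015 volumes after N4ITK bias-field and intensity correction.
x = np.random.rand(64, patch_size, patch_size, n_channels).astype("float32")
y = np.random.randint(0, n_classes, size=64)
model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```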

Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG

Procedia PDF Downloads 153
19999 Procedure Model for Data-Driven Decision Support Regarding the Integration of Renewable Energies into Industrial Energy Management

Authors: M. Graus, K. Westhoff, X. Xu

Abstract:

Climate change is causing change in all aspects of society. While the expansion of renewable energies proceeds, industry has not been convinced by general studies about the potential of demand-side management to reinforce smart grid considerations in its operational business. In this article, a procedure model for case-specific, data-driven decision support for industrial energy management, based on a holistic data analytics approach, is presented. The model is executed on the example of the strategic decision problem of integrating renewable energies into industrial energy management. This question arises from considerations of changing the electricity contract model from a standard rate to volatile energy prices corresponding to the energy spot market, which is increasingly affected by renewable energies. The procedure model corresponds to a data analytics process consisting of data-model, analysis, simulation, and optimization steps. This procedure helps to quantify the potential of sustainable production concepts based on data from a factory. The model is validated with data from a printer, in analogy to a simple production machine. The overall goal is to establish smart grid principles for industry via the transformation from knowledge-driven to data-driven decisions within manufacturing companies.

Keywords: data analytics, green production, industrial energy management, optimization, renewable energies, simulation

Procedia PDF Downloads 411
19998 Functional and Efficient Query Interpreters: Principle, Application and Performances’ Comparison

Authors: Laurent Thiry, Michel Hassenforder

Abstract:

This paper presents a general approach to implementing efficient query interpreters in a functional programming language. Indeed, most of the standard tools currently available use an imperative and/or object-oriented language for the implementation (e.g., Java for Jena-Fuseki), but other paradigms are possible, with possibly better performance. To proceed, the paper first explains how to model data structures and queries from a functional point of view. It then proposes a general methodology to measure performance (i.e., the number of computation steps needed to answer a query) and explains how to integrate some optimization techniques (short-cut fusion and, more importantly, data transformations). It finally compares the proposed functional server to a standard tool (Fuseki), demonstrating that the former can be two to ten times faster at answering queries.

Keywords: data transformation, functional programming, information server, optimization

Procedia PDF Downloads 124
19997 Interoperability Model Design of Smart Grid Power System

Authors: Seon-Hack Hong, Tae-Il Choi

Abstract:

Interoperability is defined as the ability of systems, components, and devices developed by different entities to exchange information smoothly and function organically without mutual consultation, to communicate with computer systems of the same or different types, and to use the exchanged information without extra effort. Addressing the insufficiencies caused by the lack of interoperability in the electric power system, such as the duplication of functions when developing systems and applications and the low efficiency due to the absence of a mutual information transmission system between application programs, improves the design and the seamless linkage of newly developed systems. Since it is necessary to secure interoperability for this purpose, we designed a smart grid-based interoperability standard model in this paper.

Keywords: interoperability, power system, common information model, SCADA, IEEE2030, Zephyr

Procedia PDF Downloads 77
19996 Gaussian Probability Density for Forest Fire Detection Using Satellite Imagery

Authors: S. Benkraouda, Z. Djelloul-Khedda, B. Yagoubi

Abstract:

We present a method for early detection of forest fires from a thermal infrared satellite image, using the image matrix of the probability of belonging. The principle of the method is to compare a theoretical mathematical model to an experimental model. We consider each line of the image matrix as a realization of a non-stationary random process. Since the distribution of pixels in the satellite image is statistically dependent, we divided these lines into small stationary and ergodic intervals in order to characterize the image by an adequate mathematical model. A standard deviation was chosen to generate random variables, so that each interval behaves naturally like white Gaussian noise. The latter was selected as the mathematical model that represents the large majority of pixels, which can be considered the image background. Before modeling the image, we applied a few pre-processing steps; then the parameters of the theoretical Gaussian model were extracted from the modeled image, and these parameters were used to calculate the probability that each interval of the modeled image belongs to the theoretical Gaussian model. High-intensity pixels are regarded as foreign to this model, so they have a low probability, whereas pixels that belong to the image background have a high probability. Finally, we present the inverse of the matrix of interval probabilities for better fire detection.
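
A minimal sketch of the interval-probability idea described above: each image row is split into short intervals, a Gaussian background model is fitted, and each interval is scored by the likelihood that it belongs to that background. The synthetic "thermal image", the interval length, and the detection threshold are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rows, cols, interval = 64, 256, 16
image = rng.normal(300.0, 2.0, size=(rows, cols))      # background modeled as Gaussian noise
image[40, 96:112] += 40.0                              # a hot (fire-like) segment

# Background Gaussian parameters estimated robustly from the whole image
mu = np.median(image)
sigma = 1.4826 * np.median(np.abs(image - mu))         # MAD-based standard deviation estimate

segments = image.reshape(rows, cols // interval, interval)
seg_means = segments.mean(axis=2)
# Probability that an interval mean is this extreme under the background model
z = (seg_means - mu) / (sigma / np.sqrt(interval))
p_background = 2.0 * stats.norm.sf(np.abs(z))          # low probability => candidate fire block

hot = np.argwhere(p_background < 1e-6)
print("suspicious (row, interval) blocks:", hot.tolist())
```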

Keywords: forest fire, forest fire detection, satellite image, normal distribution, theoretical Gaussian model, thermal infrared matrix image

Procedia PDF Downloads 111
19995 Numerical Investigation of Multiphase Flow in Pipelines

Authors: Gozel Judakova, Markus Bause

Abstract:

We present and analyze reliable numerical techniques for simulating the complex flow and transport phenomena related to natural gas transportation in pipelines. Such problems are of high interest in the field of petroleum and environmental engineering. Modeling and understanding natural gas flow and transformation processes during transportation is important for the sake of physical realism and for the design and operation of pipeline systems. In our approach, a two-fluid flow model based on a system of coupled hyperbolic conservation laws is considered for describing natural gas flow undergoing hydratization. The accurate numerical approximation of two-phase gas flow remains a subject of strong interest in the scientific community. Such hyperbolic problems are characterized by solutions with steep gradients or discontinuities, and their approximation by standard finite element techniques typically gives rise to spurious oscillations and numerical artefacts. Recently, stabilized and discontinuous Galerkin finite element techniques have attracted researchers' interest, as they are highly adapted to the hyperbolic nature of our two-phase flow model. In the presentation, a streamline upwind Petrov-Galerkin approach and a discontinuous Galerkin finite element method for the numerical approximation of our flow model of two coupled systems of Euler equations are presented. The efficiency and reliability of stabilized continuous and discontinuous finite element methods for the approximation are then carefully analyzed, and the potential of either class of numerical schemes is investigated. In particular, standard benchmark problems of two-phase flow, such as the shock tube problem, are used for the comparative numerical study.

Keywords: discontinuous Galerkin method, Euler system, inviscid two-fluid model, streamline upwind Petrov-Galerkin method, two-phase flow

Procedia PDF Downloads 296
19994 Reliability-Based Life-Cycle Cost Model for Engineering Systems

Authors: Reza Lotfalian, Sudarshan Martins, Peter Radziszewski

Abstract:

The effect of reliability on life-cycle cost, including the initial and maintenance costs of a system, is studied. The failure probability of a component is used to calculate the average maintenance cost during the operating cycle of the component. The standard deviation of the life-cycle cost is also calculated as an error measure for the average life-cycle cost. As a numerical example, the model is used to study the average life-cycle cost of an electric motor.
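
A minimal sketch of the reliability-based life-cycle cost idea described above: the expected maintenance cost over the operating cycle is driven by the component failure probability, and the standard deviation of the life-cycle cost serves as an error measure. All cost and reliability numbers are illustrative, not those of the electric-motor example.

```python
import numpy as np

rng = np.random.default_rng(5)
initial_cost = 1_000.0        # purchase/installation cost
repair_cost = 150.0           # cost per corrective maintenance action
p_fail = 0.05                 # failure probability per operating period
n_periods = 60                # operating cycle length
n_sim = 100_000

failures = rng.binomial(n_periods, p_fail, size=n_sim)   # failures over the cycle
lcc = initial_cost + repair_cost * failures               # simulated life-cycle costs

print(f"average life-cycle cost: {lcc.mean():.1f}  "
      f"std (error measure): {lcc.std(ddof=1):.1f}  "
      f"analytical mean: {initial_cost + repair_cost * n_periods * p_fail:.1f}")
```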

Keywords: initial cost, life-cycle cost, maintenance cost, reliability

Procedia PDF Downloads 562
19993 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity

Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang

Abstract:

Word discovery is a key problem in text information retrieval technology. Methods for new word discovery tend to operate at the word level, because they generally obtain new word results by analyzing words. With the popularity of social networks, individual netizens and online self-media have generated various network texts for the convenience of online life, including network words that are far from standard Chinese expression. How to detect network words is one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model and clustering methods to propose a network word discovery framework based on sentence semantic similarity (S³-NWD) that detects network words effectively from a corpus. The framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of the sentence semantic vectors to determine the semantic relationship between sentences, and finally realizes network word discovery by means of semantic replacement between sentences. The experiments verify that the framework not only discovers network words rapidly but also recovers the standard-word meaning of the discovered network words, which reflects the effectiveness of our work.
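
A minimal sketch of the sentence-semantic-similarity step in the framework described above. The embed() function is a hypothetical stand-in for any distributed sentence representation model; only the cosine-similarity test and the semantic-replacement check between a sentence containing a candidate network word and its rewritten form are illustrated.

```python
import numpy as np

def embed(sentence: str) -> np.ndarray:
    # Hypothetical placeholder: hash tokens into a fixed-size vector. A real system
    # would use a trained distributed sentence representation model instead.
    vec = np.zeros(64)
    for token in sentence.split():
        vec[hash(token) % 64] += 1.0
    return vec

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

candidate, standard = "yyds", "forever the best"          # network word vs. standard phrase
sentence = "this singer is yyds in my heart"
replaced = sentence.replace(candidate, standard)

similarity = cosine(embed(sentence), embed(replaced))
print(f"semantic similarity after replacement: {similarity:.3f}")
# A high similarity suggests the standard phrase preserves the sentence meaning,
# supporting it as the recovered meaning of the candidate network word.
```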

Keywords: text information retrieval, natural language processing, new word discovery, information extraction

Procedia PDF Downloads 60
19992 The Prospects and Challenges of Adopting an Environmental Management System by Higher Education Institutions in Lebanon

Authors: May A. Massoud, R. Harissi

Abstract:

The fundamental principle and overall goal of an Environmental Management System is the concept of continual improvement, and the implementation of such a system reveals a commitment to compliance and sustainable development. This research project aims at identifying and evaluating the prospects and challenges facing the adoption of the ISO 14001 standard in the higher education system of Lebanon. It examines the corresponding barriers, drivers, and incentives associated with the implementation of the standard. For this purpose, primary data were collected using a quantitative method. The results revealed a significant lack of knowledge of, and sense of responsibility towards, the ISO 14001 standard and environmental accountability. Improving educational and social responsibility, improving environmental performance, and enhancing institutional image are the most noticeable drivers for adopting ISO 14001. The main perceived barriers to acquiring the standard are the unclear benefits of ISO 14001, the lack of government support, and the fact that the standard is not seen as a priority by top management. Lebanese higher education institutions are unlikely to consider ISO 14001 before having proper accreditation programs or until ISO 14001 becomes widely known in the Lebanese economic sectors.

Keywords: ISO 14001, higher education institution, environmental management, system

Procedia PDF Downloads 399
19991 Detection of Cardiac Arrhythmia Using Principal Component Analysis and Xgboost Model

Authors: Sujay Kotwale, Ramasubba Reddy M.

Abstract:

The electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease that can lead to the death of the patient when left untreated, so early detection of cardiac arrhythmia would help doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve the performance, this paper implements principal component analysis (PCA) along with an XGBoost model. PCA is applied to the raw ECG signals to suppress redundant information and extract significant features. The obtained significant ECG features are fed into the XGBoost model, and the performance of the model is evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
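
A minimal sketch of the PCA-plus-XGBoost pipeline described above, run on synthetic stand-ins for heartbeat segments. The segment length, number of components, and classifier settings are assumptions; the study itself uses beats from the MIT-BIH arrhythmia database.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(6)
n_beats, seg_len = 2000, 180
X = rng.standard_normal((n_beats, seg_len))              # stand-in ECG beat segments
y = rng.integers(0, 5, size=n_beats)                     # stand-in arrhythmia classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

pca = PCA(n_components=20).fit(X_train)                  # suppress redundant information
clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
clf.fit(pca.transform(X_train), y_train)

print(f"test accuracy: {clf.score(pca.transform(X_test), y_test):.3f}")
```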

Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost

Procedia PDF Downloads 88
19990 Onco@Home: Comparing the Costs, Revenues, and Patient Experience of Cancer Treatment at Home with the Standard of Care

Authors: Sarah Misplon, Wim Marneffe, Johan Helling, Jana Missiaen, Inge Decock, Dries Myny, Steve Lervant, Koen Vaneygen

Abstract:

The aim of this study was twofold. First, we investigated whether the current funding from the national health insurance (NHI) of home hospitalization (HH) for oncological patients is sufficient in Belgium. Second, we compared patients' experiences of and preferences for HH with the standard of care (SOC). Two HH models were examined in three Belgian hospitals and three home nursing organizations. In the first HH model, the blood draw and monitoring prior to intravenous therapy were performed by a trained home nurse at the patient's home the day before the visit to the day hospital. In the second HH model, the administration of two subcutaneous treatments was partly provided at home instead of in the hospital. We conducted (1) a bottom-up micro-costing study to compare the costs and revenues for the providers (hospitals and home care organizations), and (2) a cross-sectional survey to compare patients' experiences and preferences between the SOC group and the HH group. Our results show that HH patients prefer HH and that none of them wanted to return to SOC, although patient satisfaction was not significantly different between the two categories. At the same time, we find that the costs associated with HH are higher overall. Comparing revenues with costs, we conclude that the current funding from the NHI of HH for oncological patients is insufficient.

Keywords: cost analysis, health insurance, preference, home hospitalization

Procedia PDF Downloads 89
19989 Effects of the Air Supply Outlets Geometry on Human Comfort inside Living Rooms: CFD vs. ADPI

Authors: Taher M. Abou-deif, Esmail M. El-Bialy, Essam E. Khalil

Abstract:

The paper is devoted to numerically investigating the influence of the air supply outlet geometry on human comfort inside living rooms. A computational fluid dynamics model is developed to examine the airflow characteristics of a room with different supply air diffusers. The work focuses on the airflow patterns and thermal behavior in a room with a small number of occupants. As an input to the full-scale 3-D room model, a 2-D air supply diffuser model that supplies the direction and magnitude of the airflow into the room is developed. The effect of the air distribution on the thermal comfort parameters was investigated by changing the air supply diffuser type, angles, and velocity; the locations and numbers of the air supply diffusers were also investigated. The pre-processor Gambit is used to create the geometric model with parametric features. The commercially available simulation software Fluent 6.3 is used to solve the differential equations governing the conservation of mass, the three components of momentum, and energy in the computation of the airflow distribution. Turbulence effects of the flow are represented by a well-developed two-equation turbulence model; in this work, the so-called standard k-ε turbulence model, one of the most widespread turbulence models for industrial applications, was utilized. The basic parameters considered in this work are air dry-bulb temperature, air velocity, relative humidity, and turbulence parameters, which are used for numerical predictions of indoor air distribution and thermal comfort. The thermal comfort predictions in this work were based on the ADPI (Air Diffusion Performance Index), the PMV (Predicted Mean Vote) model, and the PPD (Percentage of People Dissatisfied) model; the PMV and PPD were estimated using Fanger's model.

Keywords: thermal comfort, Fanger's model, ADPI, energy efficiency

Procedia PDF Downloads 367
19988 Numerical Analysis of the Effect of Height and Rate of Fluid Flow on a Stepped Spillway

Authors: Amir Abbas Kamanbedast, Abbas Saki

Abstract:

Stepped spillways are composed of several steps, which start around the spillway crest and continue to the downstream heel. Recently, such spillways have been receiving increasing attention due to the significant effect of the steps on the flow's rate of energy dissipation. Energy dissipation in stepped spillways across the overflow can be explained by the watercourse's contact with the steps (i.e., large, rough surfaces). In this context, less energy must be dissipated at the end of the spillway, and, hence, a smaller (less expensive) energy-dissipating structure is required. In this study, a stepped spillway was simulated using the Fluent software, and a standard turbulence model was used to model the flow turbulence. The energy dissipation from the stepped spillway was investigated for different numbers of steps. Using the k-ε model, the numerical predictions of velocity and flow depth at the downstream end of the overflow were obtained, and the energy dissipated throughout the spillway was then calculated. Our results showed that an increase in the number of steps can considerably increase the amount of energy dissipation for a fixed upstream energy. In addition, the results of the numerical analyses are provided as isobar and velocity curves, so that points sensitive to cavitation can be determined.

Keywords: stepped spillway, fluent software, turbulence model of k-ε, VOF model

Procedia PDF Downloads 267
19987 Ubiquitous Collaborative Mobile Learning (UCML): A Flexible Instructional Design Model for Social Learning

Authors: Hameed Olalekan Bolaji

Abstract:

Digital natives are driving the trends of literacy in the use of electronic devices for learning purposes. This has reconfigured the context of learning into the exploration of knowledge in a social learning environment. This study explores the impact of the Ubiquitous Collaborative Mobile Learning (UCML) instructional design model using a quantitative design-based research approach. The UCML model is a synergetic blend of four models that are relevant to the design of instructional content for a social learning environment. The UCML model served as the treatment, and instructions were transmitted via mobile devices based on the 'bring your own device' (BYOD) principle to promote social learning. Three research questions and two hypotheses were raised to guide the conduct of this study. A researcher-designed questionnaire was used to collect data, and its reliability was assessed with Cronbach's alpha, which yielded 0.91. Descriptive statistics (mean and standard deviation) were used to answer the research questions, while inferential statistics (independent-samples t-test) were used to test the hypotheses. The findings reveal that the UCML model was adequately developed and that it promotes social learning through its design principles and the use of mobile devices.

Keywords: collaboration, mobile device, social learning, ubiquitous

Procedia PDF Downloads 116
19986 Virtual Test Model for Qualification of Knee Prosthesis

Authors: K. Zehouani, I. Oldal

Abstract:

Purpose: In the human knee joint, degenerative joint disease may develop over time. The standard treatment of this disease is total knee replacement through prosthesis implantation. The reason lies in the fact that the joint's combined motion causes different material abrasion compared to pure sliding or rolling alone. This study focuses on developing a knee prosthesis geometry that fulfills the mechanical and kinematical requirements. Method: The MSC ADAMS program is used to describe the rotation of the human knee joint as a function of flexion and to investigate how the flexion and rotation movement changes between the condyles of a multi-body model of the knee prosthesis as a function of the flexion angle (in the functional arc of the knee, 20-120º). Moreover, a multi-body model with identical boundary conditions is constructed, and the numerical simulations are carried out using the MSC ADAMS program system. Results: It is concluded that the use of the multi-body model reduces time and cost, since it does not require manufacturing the tibia and the femur, as a test machine for the knee prosthesis would. Moreover, without measurements on a test machine for the knee prosthesis geometry, the results of our model can be related to a human knee directly. Conclusion: The pattern obtained by the multi-body model provides insight for future experimental tests related to the rotation and flexion of the knee joint under the actual average and friction loads.

Keywords: biomechanics, knee joint, rotation, flexion, kinematics, MSC ADAMS

Procedia PDF Downloads 111
19985 Kinetic Study of Thermal Degradation of a Lignin Nanoparticle-Reinforced Phenolic Foam

Authors: Juan C. Domínguez, Belén Del Saz-Orozco, María V. Alonso, Mercedes Oliet, Francisco Rodríguez

Abstract:

In the present study, the kinetics of thermal degradation of a phenolic foam, a lignin-reinforced phenolic foam, and the lignin used as reinforcement were studied, and the activation energies of their degradation processes were obtained with a DAEM model. The mean activation energies, averaged over five heating rates, were 99.1, 128.2, and 144.0 kJ·mol⁻¹ for the phenolic foam, 109.5, 113.3, and 153.0 kJ·mol⁻¹ for the lignin reinforcement, and 82.1, 106.9, and 124.4 kJ·mol⁻¹ for the lignin-reinforced phenolic foam. The standard deviation ranges calculated for each sample were 1.27-8.85, 2.22-12.82, and 3.17-8.11 kJ·mol⁻¹ for the phenolic foam, the lignin, and the reinforced foam, respectively. The DAEM model showed low mean square errors (< 1×10⁻⁵), proving that it is a suitable model to study the kinetics of thermal degradation of the foams and the reinforcement.

Keywords: kinetics, lignin, phenolic foam, thermal degradation

Procedia PDF Downloads 453