Search results for: computational model(s)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7949

7679 Modelling Mode Choice Behaviour Using Cloud Theory

Authors: Leah Wright, Trevor Townsend

Abstract:

Mode choice models are crucial instruments in the analysis of travel behaviour. These models show the relationship between an individual’s choice of transportation mode for a given O-D pair and the individual’s socioeconomic characteristics such as household size and income level, age and/or gender, and the features of the transportation system. The most popular functional forms of these models are based on Utility-Based Choice Theory, which addresses the uncertainty in the decision-making process with the use of an error term. However, with the development of artificial intelligence, many researchers have started to take a different approach to travel demand modelling. In recent times, researchers have looked at using neural networks, fuzzy logic and rough set theory to develop improved mode choice formulas. The concept of cloud theory has recently been introduced to model decision-making under uncertainty. Unlike the previously mentioned theories, cloud theory recognises a relationship between randomness and fuzziness, two of the most common types of uncertainty. This research aims to investigate the use of cloud theory in mode choice models. This paper highlights the conceptual framework of a mode choice model based on cloud theory. Merging decision-making under uncertainty with mode choice models is a state-of-the-art direction. The cloud theory model is expected to address the issues and concerns associated with the nested logit model and to improve the design of mode choice models and their use in travel demand modelling.
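
As an illustration of the cloud-theory building block such a model could rest on, the sketch below implements a generic forward normal cloud generator, which maps a concept described by an expectation Ex, entropy En and hyper-entropy He into random "cloud drops" with membership degrees. This is a minimal sketch of the standard generator, not the authors' formulation, and the example parameter values are assumptions.

```python
import numpy as np

def forward_normal_cloud(ex, en, he, n_drops=1000, rng=None):
    """Forward normal cloud generator (generic sketch).

    ex: expectation of the concept, en: entropy (fuzziness),
    he: hyper-entropy (randomness of the fuzziness).
    Returns drop positions x and their membership degrees mu.
    """
    rng = np.random.default_rng(rng)
    # Each drop gets its own 'entropy' drawn around En, controlled by He.
    en_prime = np.abs(rng.normal(en, he, size=n_drops)) + 1e-12
    x = rng.normal(ex, en_prime)                          # drop position
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))   # membership degree
    return x, mu

# Hypothetical example: perceived travel time of a mode (minutes).
drops, membership = forward_normal_cloud(ex=30.0, en=5.0, he=0.8, n_drops=500)
print(drops[:3], membership[:3])
```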

Keywords: Cloud theory, decision-making, mode choice models, travel behaviour, uncertainty

Procedia PDF Downloads 352
7678 Simulation of Channel Models for Device-to-Device Application of 5G Urban Microcell Scenario

Authors: H. Zormati, J. Chebil, J. Bel Hadj Tahar

Abstract:

Next-generation wireless transmission technology (5G) is expected to support the development of channel models for higher frequency bands, so the characterisation of these high frequency bands is the most important issue in radio propagation research for 5G. Multiple urban microcellular measurements have been carried out at 60 GHz. In this paper, the collected data are uniformly analysed with a focus on the path loss (PL); the objective is to compare the simulation results of several studied channel models in order to test the performance of each one.
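
For context, a widely used millimetre-wave path loss form that such comparisons often include is the close-in (CI) free-space reference distance model; the sketch below evaluates it at 60 GHz. The path loss exponent and shadowing standard deviation are illustrative assumptions, not parameters reported by the authors.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def ci_path_loss_db(freq_hz, d_m, n=2.9, sigma_db=0.0, rng=None):
    """Close-in (CI) free-space reference distance path loss model.

    PL(d) = FSPL(f, 1 m) + 10 * n * log10(d / 1 m) + X_sigma,
    where FSPL(f, 1 m) is the free-space path loss at 1 m and
    X_sigma is log-normal shadowing with standard deviation sigma_db.
    """
    rng = np.random.default_rng(rng)
    fspl_1m = 20.0 * np.log10(4.0 * np.pi * freq_hz * 1.0 / C)
    shadowing = rng.normal(0.0, sigma_db, size=np.shape(d_m)) if sigma_db > 0 else 0.0
    return fspl_1m + 10.0 * n * np.log10(np.asarray(d_m, dtype=float)) + shadowing

# Illustrative 60 GHz urban-microcell evaluation at a few Tx-Rx separations.
distances = np.array([10.0, 50.0, 100.0, 200.0])
print(ci_path_loss_db(60e9, distances, n=2.9, sigma_db=8.0))
```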

Keywords: 5G, channel model, 60 GHz channel, millimeter-wave, urban microcell

Procedia PDF Downloads 280
7677 A Fast Convergence Subband BSS Structure

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin

Abstract:

A blind source separation method is proposed; in this method, we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared with the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved with the use of adaptive filters of orders lower than those of the full-band adaptive filters, which operate at a sampling rate lower than the sampling rate of the input signal. The signals decomposed by the analysis filter bank are less correlated in each sub-band than the input signal at full bandwidth, and this can promote better rates of convergence.
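
To make the sub-band idea concrete, the sketch below splits a toy two-microphone mixture with a small non-uniform analysis bank, decimates each band, and runs a standard natural-gradient (Infomax-style) unmixing rule independently in each band. This is a generic illustration, not the authors' algorithm or normalisation; the filter design, step size and toy signals are all assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def analysis_bank(x, fs, edges):
    """Split each mixture channel into bandpass subbands and decimate by 2 (crudely)."""
    bands = []
    for lo, hi in edges:
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        bands.append(sosfiltfilt(sos, x, axis=-1)[..., ::2])
    return bands  # list of (n_channels, n_samples // 2) arrays

def natural_gradient_bss(x, mu=1e-3, n_iter=200, rng=None):
    """Per-band unmixing via the natural-gradient rule W += mu*(I - tanh(y) y^T / N) W."""
    rng = np.random.default_rng(rng)
    n, n_samples = x.shape
    w = np.eye(n) + 1e-3 * rng.standard_normal((n, n))
    for _ in range(n_iter):
        y = w @ x
        w += mu * (np.eye(n) - np.tanh(y) @ y.T / n_samples) @ w
    return w @ x

# Hypothetical two-source, two-microphone instantaneous mixture of super-Gaussian signals.
rng = np.random.default_rng(0)
fs = 8000
s = rng.laplace(size=(2, fs))                       # 1 s of two independent sources
x = np.array([[1.0, 0.6], [0.5, 1.0]]) @ s          # mixing matrix
subbands = analysis_bank(x, fs, edges=[(50, 1000), (1000, 3900)])  # non-uniform split
separated = [natural_gradient_bss(b) for b in subbands]
print([b.shape for b in separated])
```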

Keywords: blind source separation, computational complexity, subband, convergence speed, mixture

Procedia PDF Downloads 524
7676 Contrasted Mean and Median Models in Egyptian Stock Markets

Authors: Mai A. Ibrahim, Mohammed El-Beltagy, Motaz Khorshid

Abstract:

Emerging-market return distributions show a significant departure from normality: they are characterized by fatter tails relative to the normal distribution and exhibit levels of skewness and kurtosis that constitute a significant departure from normality. Therefore, the classical Markowitz mean-variance framework is not applicable for emerging markets, since it assumes normally distributed returns (with zero skewness and kurtosis) and a quadratic utility function. Moreover, the Markowitz mean-variance analysis can be used in cases of moderate non-normality, where it still provides a good approximation of the expected utility, but it may be ineffective under large departures from normality. Higher-moment models and median models have been suggested in the literature for asset allocation in this case. Higher-moment models have been introduced to account for the insufficiency of describing a portfolio by only its first two moments, while the median model has been introduced as a robust statistic which is less affected by outliers than the mean. Tail risk measures such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) have been introduced instead of variance to capture the effect of risk. In this research, higher-moment models including the Mean-Variance-Skewness (MVS) and Mean-Variance-Skewness-Kurtosis (MVSK) are formulated as single-objective non-linear programming (NLP) problems, and median models including the Median-Value-at-Risk (MedVaR) and Median-Mean Absolute Deviation (MedMAD) are formulated as single-objective mixed-integer linear programming (MILP) problems. The higher-moment models and median models are compared to some benchmark portfolios and tested on real financial data from the Egyptian main index EGX30. The results show that all the median models outperform the higher-moment models, as they provide higher final wealth for the investor over the entire period of study. In addition, the results confirm the inapplicability of the classical Markowitz mean-variance framework to the Egyptian stock market, as it resulted in very low realized profits.
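
To make the single-objective NLP formulation concrete, the sketch below sets up a generic mean-variance-skewness portfolio problem on simulated fat-tailed returns and solves it with a sequential quadratic programming routine. The trade-off weights, the simulated data and the long-only constraint are assumptions for illustration and do not reproduce the authors' EGX30 models.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
returns = rng.standard_t(df=5, size=(500, 6)) * 0.01   # fat-tailed toy returns, 6 assets
mu = returns.mean(axis=0)
cov = np.cov(returns, rowvar=False)

def neg_mvs_objective(w, lam_var=3.0, lam_skew=1.0):
    """Negative of (mean - lam_var * variance + lam_skew * skewness) of the portfolio."""
    port = returns @ w
    var = w @ cov @ w
    skew = np.mean((port - port.mean()) ** 3) / (port.std() ** 3 + 1e-12)
    return -(mu @ w - lam_var * var + lam_skew * skew)

n_assets = returns.shape[1]
res = minimize(
    neg_mvs_objective,
    x0=np.full(n_assets, 1.0 / n_assets),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * n_assets,                               # long-only
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],  # fully invested
)
print("weights:", np.round(res.x, 3))
```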

Keywords: Egyptian stock exchange, emerging markets, higher moment models, median models, mixed-integer linear programming, non-linear programming

Procedia PDF Downloads 284
7675 The Influence of Covariance Hankel Matrix Dimension on Algorithms for VARMA Models

Authors: Celina Pestano-Gabino, Concepcion Gonzalez-Concepcion, M. Candelaria Gil-Fariña

Abstract:

Some estimation methods for VARMA models, and Multivariate Time Series Models in general, rely on the use of a Hankel matrix. It is known that if the data sample is populous enough and the dimension of the Hankel matrix is unnecessarily large, this may result in an unnecessary number of computations as well as in numerical problems. In this sense, the aim of this paper is two-fold. First, we provide some theoretical results for these matrices which translate into a lower dimension for the matrices normally used in the algorithms. This contribution thus serves to improve those methods from a numerical and, presumably, statistical point of view. Second, we have chosen an estimation algorithm to illustrate in practice our improvements. The results we obtained in a simulation of VARMA models show that an increase in the size of the Hankel matrix beyond the theoretical bound proposed as valid does not necessarily lead to improved practical results. Therefore, for future research, we propose conducting similar studies using any of the linear system estimation methods that depend on Hankel matrices.
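
As a point of reference for the matrices involved, the sketch below assembles a block Hankel matrix from the sample cross-covariance matrices of a multivariate series. The toy series, the lag range and the block dimensions are illustrative assumptions, and the theoretical bound discussed in the paper is not reproduced here.

```python
import numpy as np

def sample_cross_cov(x, lag):
    """Sample covariance matrix Gamma(lag) = Cov(x_{t+lag}, x_t) for a (T, k) series."""
    xc = x - x.mean(axis=0)
    t = xc.shape[0]
    return xc[lag:].T @ xc[: t - lag] / t

def block_hankel(x, p, q):
    """Block Hankel matrix whose (i, j) block is Gamma(i + j + 1), i < p, j < q."""
    blocks = [[sample_cross_cov(x, i + j + 1) for j in range(q)] for i in range(p)]
    return np.block(blocks)

# Toy bivariate VAR(1)-like series, just to exercise the construction.
rng = np.random.default_rng(0)
x = np.zeros((400, 2))
for t in range(1, 400):
    x[t] = np.array([[0.5, 0.1], [0.0, 0.3]]) @ x[t - 1] + rng.standard_normal(2)

h = block_hankel(x, p=3, q=3)
print(h.shape)  # (3*2, 3*2)
```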

Keywords: covariances Hankel matrices, Kronecker indices, system identification, VARMA models

Procedia PDF Downloads 215
7674 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood

Authors: Randa Alharbi, Vladislav Vyshemirsky

Abstract:

Systems biology is an important field in science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining the essential components of the system, and representing an appropriate law that can define the interactions between its components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable to describe and analyse biological systems. The Continuous-Time Markov Chain (CTMC) is one of the probabilistic models that describe the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires the evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation is a common approach for tackling inference which relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, which is based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper, we discuss the efficiency and possible practical issues of each method, taking into account their computational time. We demonstrate likelihood-free inference by analysing a model of the repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
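
To illustrate the likelihood-free idea in its simplest form, the sketch below performs ABC rejection sampling for the birth rate of a toy birth-death CTMC simulated with the Gillespie algorithm. The prior, tolerance, summary statistic and toy model are assumptions for illustration and are far simpler than the repressilator analysis in the paper.

```python
import numpy as np

def gillespie_birth_death(birth, death, x0=10, t_max=10.0, rng=None):
    """Gillespie simulation of a birth-death CTMC; returns the final population."""
    rng = np.random.default_rng(rng)
    t, x = 0.0, x0
    while t < t_max:
        rates = np.array([birth * x, death * x])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)
        x += 1 if rng.random() < rates[0] / total else -1
    return x

def abc_rejection(observed, n_samples=200, tol=5.0, rng=None):
    """ABC rejection: keep prior draws whose simulated summary lies close to the data."""
    rng = np.random.default_rng(rng)
    accepted = []
    while len(accepted) < n_samples:
        birth = rng.uniform(0.0, 1.0)                 # prior on the birth rate
        sim = gillespie_birth_death(birth, death=0.4, rng=rng)
        if abs(sim - observed) <= tol:                # distance on a simple summary
            accepted.append(birth)
    return np.array(accepted)

true_final = gillespie_birth_death(birth=0.5, death=0.4, rng=1)
posterior = abc_rejection(observed=true_final)
print("posterior mean of birth rate:", posterior.mean())
```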

Keywords: Approximate Bayesian computation (ABC), Continuous-Time Markov Chains, Sequential Monte Carlo, Particle Markov chain Monte Carlo (PMCMC)

Procedia PDF Downloads 179
7673 Energy Consumption Models for Electric Vehicles: Survey and Proposal of a More Realistic Model

Authors: I. Sagaama, A. Kechiche, W. Trojet, F. Kamoun

Abstract:

Replacing combustion-engine vehicles with electric vehicles (EVs) has been a major step in recent years due to their potential benefits. Battery autonomy and the charging process are still big issues for this kind of vehicle; therefore, reducing the energy consumption of electric vehicles becomes a necessity. Much research targets introducing recent information and communication technologies into EVs in order to propose energy-saving services. The evaluation of realistic scenarios is a big challenge nowadays. In this paper, we elaborate a state of the art of the different energy consumption models proposed in the literature, then present a comparative study of these models, and finally extend previous works in order to propose an accurate and realistic energy model for calculating the instantaneous power consumption of electric vehicles.
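
As a reference point for what an instantaneous power model typically contains, the sketch below evaluates a standard longitudinal-dynamics formulation (rolling resistance, aerodynamic drag, grade and acceleration terms divided by a drivetrain efficiency). The vehicle parameters are generic assumptions, and the expression is not the specific model proposed by the authors.

```python
import numpy as np

G = 9.81          # gravity, m/s^2
RHO_AIR = 1.225   # air density, kg/m^3

def instantaneous_power_w(v, a, grade_rad=0.0, mass=1500.0, cd=0.29,
                          frontal_area=2.2, crr=0.01, eta_drivetrain=0.9):
    """Traction power demand of an EV from longitudinal dynamics.

    v: speed (m/s), a: acceleration (m/s^2), grade_rad: road slope (rad).
    Regenerative braking and auxiliary loads are deliberately ignored here.
    """
    f_roll = crr * mass * G * np.cos(grade_rad)
    f_aero = 0.5 * RHO_AIR * cd * frontal_area * v ** 2
    f_grade = mass * G * np.sin(grade_rad)
    f_accel = mass * a
    wheel_power = (f_roll + f_aero + f_grade + f_accel) * v
    return wheel_power / eta_drivetrain

# Hypothetical example: cruising at 90 km/h on a flat road with mild acceleration.
print(instantaneous_power_w(v=25.0, a=0.2, grade_rad=0.0))
```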

Keywords: electric vehicle, vehicular networks, energy models, traffic simulation

Procedia PDF Downloads 329
7672 Generation of 3D Models Obtained with Low-Cost RGB and Thermal Sensors Mounted on Drones

Authors: Julio Manuel De Luis Ruiz, Javier Sedano Cibrián, Rubén Pérez Álvarez, Raúl Pereda García, Felipe Piña García

Abstract:

Nowadays it is common to resort to aerial photography to carry out the prospection and/or exploration of archaeological sites. In this sense, the classic 3D models are being applied to investigate the direction towards which the generally subterranean structures of an archaeological site may continue and therefore, to help in making the decisions that define the location of new excavations. In recent years, Unmanned Aerial Vehicles (UAVs) have been applied as the vehicles that carry the sensor. This implies certain advantages, such as the possibility of including low-cost sensors, given that these vehicles can carry the sensor at relatively low altitudes. Due to this, low-cost dual sensors have recently begun to be used. This new equipment can collaborate with classic Digital Elevation Models (DEMs) in the exploration of archaeological sites, but this entails the need for a methodological setting to optimise the acquisition, processing and exploitation of the information provided by low-cost dual sensors. This research focuses on the design of an appropriate workflow to obtain 3D models with low-cost sensors carried on UAVs, both in the RGB and thermal domains. All the foregoing has been applied to the archaeological site of Juliobriga, located in Cantabria (Spain).

Keywords: process optimization, RGB models, thermal models, UAV, workflow

Procedia PDF Downloads 109
7671 Heterogeneous Intelligence Traders and Market Efficiency: New Evidence from Computational Approach in Artificial Stock Markets

Authors: Yosra Mefteh Rekik

Abstract:

A computational agent-based model of financial markets stresses the interactions and dynamics among a very diverse set of traders. The growing body of research in this area relies heavily on computational tools which bypass the restrictions of an analytical method. The main goal of this research is to understand how the stock market operates and behaves, how to invest in it, and to study traders’ behaviour within the context of artificial stock markets populated by heterogeneous agents. All agents are characterized by adaptive learning behaviour represented by artificial neural networks. Using agent-based simulations on an artificial market, we show that the existence of heterogeneous agents can explain the price dynamics in the financial market. We investigate the relation between market diversity and market efficiency. Our empirical findings demonstrate that greater market heterogeneity plays a key role in market efficiency.

Keywords: agent-based modeling, artificial stock market, heterogeneous expectations, financial stylized facts, computational finance

Procedia PDF Downloads 399
7670 Recognition of Cursive Arabic Handwritten Text Using Embedded Training Based on Hidden Markov Models (HMMs)

Authors: Rabi Mouhcine, Amrouch Mustapha, Mahani Zouhir, Mammass Driss

Abstract:

In this paper, we present a system for the offline recognition of cursive Arabic handwritten text based on Hidden Markov Models (HMMs). The system is analytical, without explicit segmentation, and uses embedded training to build and enhance the character models. Feature extraction, preceded by baseline estimation, is statistical and geometric in order to integrate both the peculiarities of the text and the pixel distribution characteristics in the word image. These features are modelled using hidden Markov models and trained by embedded training. The experiments on images of the benchmark IFN/ENIT database show that the proposed system improves recognition.
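
For orientation, the sketch below trains a generic continuous-density HMM on sequences of frame feature vectors using hmmlearn; recognition then amounts to scoring a test sequence against each trained model. The feature dimension, number of states and random training data are assumptions, and the embedded training over word transcriptions described in the paper is not reproduced here.

```python
import numpy as np
from hmmlearn import hmm

# Toy continuous-density HMM for one character class: 6 states over 20-dim
# frame features (e.g. statistical/geometric descriptors of sliding windows).
rng = np.random.default_rng(0)
n_features = 20
train_sequences = [rng.standard_normal((int(rng.integers(30, 60)), n_features))
                   for _ in range(15)]

X = np.concatenate(train_sequences)                  # hmmlearn expects stacked frames
lengths = [seq.shape[0] for seq in train_sequences]  # plus per-sequence lengths

model = hmm.GaussianHMM(n_components=6, covariance_type="diag",
                        n_iter=25, random_state=0)
model.fit(X, lengths)

# Recognition would score a test sequence against every character/word model
# and pick the one with the highest log-likelihood.
test_seq = rng.standard_normal((40, n_features))
print("log-likelihood:", model.score(test_seq))
```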

Keywords: recognition, handwriting, Arabic text, HMMs, embedded training

Procedia PDF Downloads 325
7669 In Silico Modeling of Drugs Milk/Plasma Ratio in Human Breast Milk Using Structures Descriptors

Authors: Navid Kaboudi, Ali Shayanfar

Abstract:

Introduction: Feeding infants with safe milk from the beginning of their life is an important issue. Drugs which are used by mothers can affect the composition of milk in a way that is not only unsuitable but also toxic for infants. Consuming permeable drugs during that sensitive period could lead to serious side effects in the infant. Due to the ethical restrictions on drug testing in humans, especially women during their lactation period, computational approaches based on structural parameters could be useful. The aim of this study is to develop mechanistic models to predict the M/P ratio of drugs during the breastfeeding period based on their structural descriptors. Methods: Two hundred and nine different chemicals with their M/P ratios were used in this study. All drugs were categorized into two groups based on their M/P value according to the Malone classification: 1) drugs with M/P>1, which are considered high risk, and 2) drugs with M/P<1, which are considered low risk. Thirty-eight chemical descriptors were calculated by ACD/Labs 6.00 and DataWarrior software in order to assess penetration during the breastfeeding period. Later on, four specific models based on the number of hydrogen bond acceptors, polar surface area, total surface area, and number of acidic oxygens were established for the prediction. The mentioned descriptors can predict the penetration with acceptable accuracy. For the remaining compounds (N = 147, 158, 160, and 174 for models 1 to 4, respectively) of each model, binary logistic regression with SPSS 21 was performed in order to obtain a model for predicting the penetration class of compounds. Only structural descriptors with p-value < 0.1 remained in the final model. Results and discussion: Four different models based on the number of hydrogen bond acceptors, polar surface area, and total surface area were obtained in order to predict the penetration of drugs into human milk during the breastfeeding period. About 3-4% of milk consists of lipids, and the amount of lipid increases after parturition. Lipid-soluble drugs diffuse alongside fats from plasma to the mammary glands, so lipophilicity plays a vital role in predicting the penetration class of drugs during the lactation period. It was shown in the logistic regression models that compounds with a number of hydrogen bond acceptors, PSA and TSA above 5, 90 and 25, respectively, are less permeable to milk because they are less soluble in the milk fat. The pH of milk is acidic, and due to that, basic compounds tend to be more concentrated in milk than in plasma, while acidic compounds may have lower concentrations in milk than in plasma. Conclusion: In this study, we developed four regression-based models to predict the penetration class of drugs during the lactation period. The obtained models can lead to a higher speed in the drug development process, saving energy and costs. Milk/plasma ratio assessment of drugs requires multiple steps of animal testing, which has its own ethical issues. QSAR modelling could help scientists reduce the amount of animal testing, and our models are also eligible to do that.
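
A generic version of such a descriptor-based binary classifier is sketched below with scikit-learn instead of SPSS; the descriptor table, labels and train/test split are synthetic stand-ins and do not reflect the authors' dataset or fitted coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy stand-in for a descriptor table: [H-bond acceptors, PSA, TSA, acidic oxygens].
rng = np.random.default_rng(7)
X = np.column_stack([
    rng.integers(0, 12, 209),       # number of hydrogen bond acceptors
    rng.uniform(0, 200, 209),       # polar surface area (A^2)
    rng.uniform(100, 600, 209),     # total surface area (A^2)
    rng.integers(0, 5, 209),        # number of acidic oxygens
])
# Hypothetical labels: 1 = high risk (M/P > 1), 0 = low risk (M/P < 1).
y = rng.integers(0, 2, 209)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("coefficients:", clf.coef_)
print("test accuracy:", clf.score(X_test, y_test))
```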

Keywords: logistic regression, breastfeeding, descriptors, penetration

Procedia PDF Downloads 39
7668 Stochastic Age-Structured Population Models

Authors: Arcady Ponosov

Abstract:

Many well-known age-structured population models are derived from the celebrated McKendrick-von Foerster equation (MFE), also called the biological conservation law. A similar technique is suggested for the stochastically perturbed MFE. This technique is shown to produce stochastic versions of the deterministic population models, which appear to be very different from those one can construct by simply appending additive stochasticity to deterministic equations. In particular, it is shown that stochastic Nicholson’s blowflies model should contain both additive and multiplicative stochastic noises. The suggested transformation technique is similar to that used in the deterministic case. The difference is hidden in the formulas for the exact solutions of the simplified boundary value problem for the stochastically perturbed MFE. The analysis is also based on the theory of stochastic delay differential equations.
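
For reference, the deterministic starting point mentioned here, the McKendrick-von Foerster equation with its birth-law boundary condition, can be written as follows, where n is the age density, mu the mortality rate and b the fecundity rate; this notation is a standard choice and not necessarily the authors'.

```latex
\begin{aligned}
&\frac{\partial n(a,t)}{\partial t} + \frac{\partial n(a,t)}{\partial a} = -\mu(a,t)\, n(a,t), \\
&n(0,t) = \int_{0}^{\infty} b(a,t)\, n(a,t)\, \mathrm{d}a, \qquad n(a,0) = n_{0}(a).
\end{aligned}
```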

Keywords: boundary value problems, population models, stochastic delay differential equations, stochastic partial differential equation

Procedia PDF Downloads 217
7667 MITOS-RCNN: Mitotic Figure Detection in Breast Cancer Histopathology Images Using Region Based Convolutional Neural Networks

Authors: Siddhant Rao

Abstract:

Studies estimate that there will be 266,120 new cases of invasive breast cancer and 40,920 breast cancer-induced deaths in 2018 alone. Despite the pervasiveness of this affliction, the current process of obtaining an accurate breast cancer prognosis is tedious and time-consuming. It usually requires a trained pathologist to manually examine histopathological images and identify the features that characterize various cancer severity levels. We propose MITOS-RCNN: a region-based convolutional neural network (RCNN) geared for small object detection to accurately grade one of the three factors that characterize tumor belligerence described by the Nottingham Grading System: mitotic count. Other computational approaches to mitotic figure counting and detection do not demonstrate ample recall or precision to be clinically viable. Our models outperformed all previous participants in the ICPR 2012 challenge, the AMIDA 2013 challenge and the MITOS-ATYPIA-14 challenge, along with recently published works. Our model achieved an F-measure score of 0.955, a 6.11% improvement in accuracy over the most accurate of the previously proposed models.

Keywords: breast cancer, mitotic count, machine learning, convolutional neural networks

Procedia PDF Downloads 190
7666 A Comparative Analysis of the Performance of COSMO and WRF Models in Quantitative Rainfall Prediction

Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Mary Nsabagwa, Triphonia Jacob Ngailo, Joachim Reuder, Schättler Ulrich, Musa Semujju

Abstract:

Numerical weather prediction (NWP) models are considered powerful tools for guiding quantitative rainfall prediction. A number of NWP models exist and are used at many operational weather prediction centers. This study considers two models, namely the Consortium for Small-scale Modeling (COSMO) model and the Weather Research and Forecasting (WRF) model. It compares the models’ ability to predict rainfall over Uganda for the period 21st April 2013 to 10th May 2013 using the root mean square error (RMSE) and the mean error (ME). In comparing the performance of the models, this study assesses their ability to predict light rainfall events and extreme rainfall events. All the experiments used the default parameterization configurations and the same horizontal resolution (7 km). The results show that the COSMO model had a tendency to largely predict no rain, which explains its under-prediction. The COSMO model (RMSE: 14.16; ME: -5.91) presented a significantly (p = 0.014) higher magnitude of error compared to the WRF model (RMSE: 11.86; ME: -1.09). However, the COSMO model (RMSE: 3.85; ME: 1.39) performed significantly (p = 0.003) better than the WRF model (RMSE: 8.14; ME: 5.30) in simulating light rainfall events. Both models under-predicted extreme rainfall events, with the COSMO model (RMSE: 43.63; ME: -39.58) presenting significantly higher error magnitudes than the WRF model (RMSE: 35.14; ME: -26.95). This study recommends additional diagnosis of the models’ treatment of deep convection over the tropics.
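
For clarity on the two verification scores used, a minimal sketch of their computation on paired forecast-observation values follows; the sample rainfall numbers are invented for illustration.

```python
import numpy as np

def rmse(forecast, observed):
    """Root mean square error of a forecast against observations."""
    return float(np.sqrt(np.mean((np.asarray(forecast) - np.asarray(observed)) ** 2)))

def mean_error(forecast, observed):
    """Mean error (bias); negative values indicate under-prediction."""
    return float(np.mean(np.asarray(forecast) - np.asarray(observed)))

# Hypothetical daily rainfall totals (mm) at a set of stations.
obs = [0.0, 12.5, 30.2, 5.1, 48.0]
fcst = [0.0, 8.0, 22.4, 6.3, 20.1]
print("RMSE:", rmse(fcst, obs), "ME:", mean_error(fcst, obs))
```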

Keywords: comparative performance, the COSMO model, the WRF model, light rainfall events, extreme rainfall events

Procedia PDF Downloads 234
7665 From Problem Space to Executional Architecture: The Development of a Simulator to Examine the Effect of Autonomy on Mainline Rail Capacity

Authors: Emily J. Morey, Kevin Galvin, Thomas Riley, R. Eddie Wilson

Abstract:

The key challenges faced by integrating autonomous rail operations into the existing mainline railway environment have been identified through the understanding and framing of the problem space and stakeholder analysis. This was achieved through the completion of the first four steps of Soft Systems Methodology, where the problem space has been expressed via conceptual models. Having identified these challenges, we investigated one of them, namely capacity, via the use of models and simulation. This paper examines the approach used to move from the conceptual models to a simulation which can determine whether the integration of autonomous trains can plausibly increase capacity. Within this approach, we developed an architecture and converted logical models into physical resource models and associated design features which were used to build a simulator. From this simulator, we are able to analyse mixtures of legacy-autonomous operations and produce fundamental diagrams and trajectory plots to describe the dynamic behaviour of mixed mainline railway operations.

Keywords: autonomy, executable architecture, modelling and simulation, railway capacity

Procedia PDF Downloads 53
7664 Initial Concept of Islamic Social Entrepreneurship: Identification of Research Gap from Existing Model

Authors: Mohd Adib Abd Muin

Abstract:

Social entrepreneurship has become a new phenomenon in many countries as a means of reducing social problems and eradicating poverty in communities. However, the study of social entrepreneurial activity from an Islamic perspective is still new. In addition, this research found that there is a lack of models of social entrepreneurship that focus on the Islamic perspective. Therefore, the objective of this paper is to identify the issues and research gap, from an Islamic perspective, in existing models, and to develop a concept of Islamic social entrepreneurship according to the Islamic perspective and Maqasid Shari’ah. The research methods used in this study are a literature review and a comparative analysis of 11 existing models of social entrepreneurship. The research findings, based on the analysis of these 11 existing models, show that the existing models of social entrepreneurship do not emphasize the Islamic perspective.

Keywords: component, social entrepreneurship, Islamic perspective, research gap

Procedia PDF Downloads 414
7663 Air Quality Analysis Using Machine Learning Models Under Python Environment

Authors: Salahaeddine Sbai

Abstract:

Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capabilities of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real-time or forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models in a Python environment to predict and analyse air quality change over northern Morocco in order to evaluate the impact of climate change on agriculture.
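
As a minimal example of the kind of supervised model such an analysis might use, the sketch below fits a random forest regressor that predicts a pollutant concentration from weather-related features. The feature set and synthetic data are assumptions standing in for the study's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in: [temperature (C), wind speed (m/s), humidity (%), traffic index].
rng = np.random.default_rng(3)
X = np.column_stack([
    rng.uniform(5, 40, 1000),
    rng.uniform(0, 12, 1000),
    rng.uniform(20, 95, 1000),
    rng.uniform(0, 1, 1000),
])
# Hypothetical PM10 concentration loosely driven by the features plus noise.
y = 20 + 0.8 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * X[:, 2] + 30 * X[:, 3] + rng.normal(0, 5, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```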

Keywords: air quality, machine learning models, pollution, pollutant emissions

Procedia PDF Downloads 57
7662 A Review of Literature on Theories of Construction Accident Causation Models

Authors: Samuel Opeyemi Williams, Razali Bin Adul Hamid, M. S. Misnan, Taki Eddine Seghier, D. I. Ajayi

Abstract:

Construction sites are characterized by occupational risks. A review of the literature on construction accidents reveals that many theories have been propounded over the years by different theorists, coupled with multifarious models developed by different proponents at different times. Accidents are unplanned events that are prominent on construction sites, involving materials, objects and people, with attendant damage, losses and injuries. Models have been developed to investigate the causation of accidents with the aim of preventing their occurrence. However, some of these theories have been criticized, most especially the Heinrich domino theory, which is mostly faulted for placing too much blame on operatives rather than management. The purpose of this paper is to unravel the significant construction accident causation theories and models in order to aid understanding of the theories and, consequently, to enable construction stakeholders to identify possible potential hazards on construction sites, as all stakeholders have significant roles to play in preventing accidents. Accidents are preventable; hence, understanding the risk factors of accidents and the causation theories paves the way for their prevention. However, the findings reveal that some gaps remain in the existing models, and it is recommended that further research be carried out to develop more models in order to maintain zero accidents on construction sites.

Keywords: domino theory, construction site, site safety, accident causation model

Procedia PDF Downloads 279
7661 Modelling and Simulation of Diffusion Effect on the Glycol Dehydration Unit of a Natural Gas Plant

Authors: M. Wigwe, J. G Akpa, E. N Wami

Abstract:

Mathematical models of the absorber of a glycol dehydration facility were developed using the principles of conservation of mass and energy. Models which predict the variation of the water content of the gas (in mole fraction) and the variation of the gas and liquid temperatures across the packing height were developed. These models contain contributions from bulk and diffusion flows. The effect of diffusion on the process occurring in the absorber was studied in this work. The models were validated using the initial conditions in the plant data from the Company W TEG unit in Nigeria. The results obtained showed that the effect of diffusion was noticed between z = 0 and z = 0.004 m. A deviation from plant data of 0% was observed for the gas water content at a residence time of 20 seconds, at z = 0.004 m. Similarly, deviations of 1.584% and 2.844% were observed for the gas and TEG temperatures.

Keywords: separations, absorption, simulation, dehydration, water content, triethylene glycol

Procedia PDF Downloads 466
7660 Comparative Study of Experimental and Theoretical Convective and Evaporative Heat Transfer Coefficients for Two Model Distillers

Authors: Khaoula Hidouri, Ali Benhmidene, Bechir Chouachi

Abstract:

The purification of brackish seawater is becoming a necessity rather than a choice in the face of demographic and industrial growth, especially in third-world countries. Two models are used in this work: a simple solar still and a simple solar still coupled with a heat pump. In this research, the water productivity of the Simple Solar Distiller (SSD) and the Simple Solar Distiller with Hybrid Heat Pump (SSDHP) was determined as a function of orientation, the use of the heat pump, and the use of a single or double glass cover. The productivity can exceed 1.2 L/m²h for the SSDHP model and 0.5 L/m²h for the SSD model. The global efficiency determined for the two models, SSD and SSDHP, is 30% and 50%, respectively. The internal efficiency attained 35% for the SSD model and 60% for the SSDHP model. The convective heat transfer coefficient attained 2.5 W/m²°C and 0.5 W/m²°C for the SSDHP and SSD models, respectively.
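
For reference, the convective heat transfer coefficient between the water surface and the glass cover of a solar still is commonly estimated with Dunkle's correlation, sketched below. Assuming this particular relation lies behind the reported coefficients is our illustration, and the temperatures and saturation-pressure fit used in the example are illustrative.

```python
def saturation_pressure_pa(t_celsius):
    """Approximate saturation vapour pressure of water (Pa), Magnus-type fit."""
    return 610.78 * 10 ** (7.5 * t_celsius / (t_celsius + 237.3))

def dunkle_hc(t_water, t_glass):
    """Dunkle's correlation for the water-to-glass convective coefficient (W/m^2.C)."""
    p_w = saturation_pressure_pa(t_water)
    p_g = saturation_pressure_pa(t_glass)
    term = (t_water - t_glass) + (p_w - p_g) * (t_water + 273.15) / (268.9e3 - p_w)
    return 0.884 * term ** (1.0 / 3.0)

# Illustrative operating point: 55 C water surface, 40 C glass cover.
print(round(dunkle_hc(55.0, 40.0), 2), "W/m^2.C")
```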

Keywords: productivity, efficiency, convective heat coefficient, SSD model, SSDHP model

Procedia PDF Downloads 188
7659 Fast Bayesian Inference of Multivariate Block-Nearest Neighbor Gaussian Process (NNGP) Models for Large Data

Authors: Carlos Gonzales, Zaida Quiroz, Marcos Prates

Abstract:

Several spatial variables collected at the same location that share a common spatial distribution can be modeled simultaneously through a multivariate geostatistical model that takes into account the correlation between these variables and the spatial autocorrelation. The main goal of this model is to perform spatial prediction of these variables in the region of study. Here we focus on a geostatistical multivariate formulation that relies on sharing common spatial random effect terms. In particular, the first response variable can be modeled by a mean that incorporates a shared random spatial effect, while the other response variables depend on this shared spatial term, in addition to specific random spatial effects. Each spatial random effect is defined through a Gaussian process with a valid covariance function, but in order to improve the computational efficiency when the data are large, each Gaussian process is approximated by a Gaussian random Markov field (GRMF), specifically by the block nearest neighbor Gaussian process (Block-NNGP). This approach involves dividing the spatial domain into several dependent blocks under certain constraints, where the cross blocks allow capturing the spatial dependence on a large scale, while each individual block captures the spatial dependence on a smaller scale. The multivariate geostatistical model belongs to the class of latent Gaussian models; thus, to achieve fast Bayesian inference, the integrated nested Laplace approximation (INLA) method is used. The good performance of the proposed model is shown through simulations and applications for massive data.

Keywords: Block-NNGP, geostatistics, Gaussian process, GRMF, INLA, multivariate models

Procedia PDF Downloads 63
7658 Integrated Models of Reading Comprehension: Understanding to Impact Teaching—The Teacher’s Central Role

Authors: Sally A. Brown

Abstract:

Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that features of cognitive processing and social cognitivism theory—represented in the integrated framework—highlight the importance of the role of the teacher. This model can aid teachers in not only improving reading comprehension instruction but in identifying areas of challenge for students.

Keywords: explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role

Procedia PDF Downloads 58
7657 Performance Comparison of Deep Convolutional Neural Networks for Binary Classification of Fine-Grained Leaf Images

Authors: Kamal KC, Zhendong Yin, Dasen Li, Zhilu Wu

Abstract:

Intra-plant disease classification based on leaf images is a challenging computer vision task due to similarities in the texture, color, and shape of leaves with only slight variation in leaf spots, and due to external environmental changes such as lighting and background noise. The deep convolutional neural network (DCNN) has proven to be an effective tool for binary classification. In this paper, two methods for the binary classification of diseased plant leaves using DCNNs are presented: a model created from scratch and transfer learning. Our main contribution is a thorough evaluation of 4 networks created from scratch and transfer learning of 5 pre-trained models. Training and testing of these models were performed on a plant leaf image dataset belonging to 16 distinct classes, containing a total of 22,265 images from 8 different plants, consisting of pairs of healthy and diseased leaves. We introduce a deep CNN model, Optimized MobileNet. This model, with depthwise separable CNN as a building block, attained an average test accuracy of 99.77%. We also present a fine-tuning method by introducing the concept of a convolutional block, which is a collection of different deep neural layers. Fine-tuned models proved to be efficient in terms of accuracy and computational cost. Fine-tuned MobileNet achieved an average test accuracy of 99.89% on 8 pairs of [healthy, diseased] leaf image sets.
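
A generic transfer learning recipe of the kind evaluated here is sketched below with Keras: the ImageNet-pretrained MobileNet backbone is frozen and a new classification head is trained. This is not the authors' Optimized MobileNet or their fine-tuning scheme; the image size, optimizer and head are assumptions.

```python
import tensorflow as tf

NUM_CLASSES = 2  # healthy vs diseased for one plant (binary classification)

# Transfer learning: reuse ImageNet features of MobileNet, train a new head.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze the pre-trained convolutional backbone

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data datasets of leaf images resized to 224x224,
# e.g. built with tf.keras.utils.image_dataset_from_directory(...).
# model.fit(train_ds, validation_data=val_ds, epochs=10)
model.summary()
```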

Keywords: deep convolution neural network, depthwise separable convolution, fine-grained classification, MobileNet, plant disease, transfer learning

Procedia PDF Downloads 157
7656 Hydrological Modeling of Watersheds Using the Only Corresponding Competitor Method: The Case of M’Zab Basin, South East Algeria

Authors: Oulad Naoui Noureddine, Cherif ELAmine, Djehiche Abdelkader

Abstract:

Water resources management encompasses several disciplines; among them, the modelling of the rainfall-runoff relationship is the most important for preventing natural hazards. There are several models for studying the rainfall-runoff relationship in watersheds. However, the majority of these models are not applicable to all basins of the world. In this study, a new stochastic method called the Only Corresponding Competitor (OCC) method was used for the hydrological modelling of the M’Zab watershed (south-east Algeria) in order to adapt a few empirical models to any hydrological regime. The results obtained open up a number of perspectives in which it would be interesting to experiment with hydrological models that, collectively or separately, improve the representation of a catchment’s data through the OCC method.

Keywords: modelling, optimization, rainfall-runoff relationship, empirical model, OCC

Procedia PDF Downloads 238
7655 A Subband BSS Structure with Reduced Complexity and Fast Convergence

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin

Abstract:

A blind source separation method is proposed; in this method, we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared with the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved with the use of adaptive filters of orders lower than those of the full-band adaptive filters, which operate at a sampling rate lower than the sampling rate of the input signal. The signals decomposed by the analysis filter bank are less correlated in each subband than the input signal at full bandwidth, and this can promote better rates of convergence.

Keywords: blind source separation, computational complexity, subband, convergence speed, mixture

Procedia PDF Downloads 551
7654 Lumped Parameter Models for Numerical Simulation of The Dynamic Response of Hoisting Appliances

Authors: Candida Petrogalli, Giovanni Incerti, Luigi Solazzi

Abstract:

This paper describes three lumped-parameter models for the study of the dynamic behaviour of a boom crane. The models proposed here allow evaluation of the fluctuations of the load arising from the rope and structure elasticity and from the type of motion command imposed by the winch. Calculation software was developed in order to determine the actual acceleration of the lifted mass and the dynamic overload during the lifting phase. Some application examples are presented, with the aim of showing the correlation between the magnitude of the stress and the type of motion command employed.
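
A minimal single-degree-of-freedom version of such a lumped-parameter model is sketched below: the payload hangs from a rope modelled as a spring-damper whose upper end follows a prescribed winch motion, and the dynamic overload is read from the rope force. The stiffness, damping and motion profile are assumed values, and the authors' three models are certainly richer than this.

```python
import numpy as np
from scipy.integrate import solve_ivp

M = 2000.0      # lifted mass, kg
K = 4.0e5       # rope axial stiffness, N/m (assumed)
C = 2.0e3       # rope damping, N.s/m (assumed)
G = 9.81

def winch_motion(t, v_max=0.5, t_ramp=1.0):
    """Hoisting command: linear speed ramp up to v_max, then constant speed.
    Returns (displacement, velocity) of the rope upper end."""
    t = np.asarray(t, dtype=float)
    disp = np.where(t < t_ramp, 0.5 * v_max * t**2 / t_ramp, v_max * (t - 0.5 * t_ramp))
    vel = np.where(t < t_ramp, v_max * t / t_ramp, v_max)
    return disp, vel

def rope_force(t, x, v):
    u, u_dot = winch_motion(t)
    return K * (u - x) + C * (u_dot - v)      # spring-damper rope model

def rhs(t, y):
    x, v = y                                  # payload position, velocity (up positive)
    a = (rope_force(t, x, v) - M * G) / M
    return [v, a]

y0 = [-M * G / K, 0.0]                        # start from static equilibrium sag
sol = solve_ivp(rhs, (0.0, 6.0), y0, max_step=1e-3)
forces = rope_force(sol.t, sol.y[0], sol.y[1])
print("max dynamic overload factor:", float(forces.max() / (M * G)))
```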

Keywords: crane, dynamic model, overloading condition, vibration

Procedia PDF Downloads 546
7653 Study of the Design and Simulation Work for an Artificial Heart

Authors: Mohammed Eltayeb Salih Elamin

Abstract:

This study discusses the concept of the artificial heart using engineering concepts of fluid mechanics and the characteristics of non-Newtonian fluids. The purpose is to serve heart patients and improve aspects of their lives, since statistics from the World Health Organization (WHO) indicate that diseases of the heart and blood vessels are the first cause of death in the world: about 30% of deaths worldwide are due to heart disease, so it can simply be considered the number one leading cause of death in the entire world. Since heart transplantation is very difficult and not always available, the idea of the artificial heart becomes essential. It is therefore important to participate in developing this idea by searching for and finding the weak points in earlier designs, with the hope of improving them for the benefit of humanity. In this study, a pump was designed in order to pump blood through the human body, taking into account all the factors that allow it to replace the human heart, so that it works with the same characteristics and efficiency as the human heart. The pump was designed on the principle of the diaphragm pump. Three models of blood, obtained from real blood characteristics, were simulated in order to study the effect of the pumping work on the fluid. After that, we studied the properties of this pump by using ANSYS 15 software to simulate blood flow inside the pump and the amount of stress that it will undergo. The 3D geometry modelling was done using SolidWorks, and the geometries were then imported into the ANSYS Design Modeler, which is used during the pre-processing procedure. The solver used throughout the study is ANSYS Fluent, a tool used to analyse fluid flow problems; the general, well-known term for this branch of science is Computational Fluid Dynamics (CFD). Basically, the Design Modeler is used during the pre-processing procedure, which is a crucial step before the start of the fluid flow problem. Some of the key operations are geometry creation, which specifies the domain of the fluid flow problem; next is mesh generation, which means discretization of the domain so that the governing equations can be solved at each cell; and later, the boundary zones are specified in order to apply boundary conditions for the problem. Finally, the pre-processed work is saved in the ANSYS Workbench for future continuation of the work.
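
Because the abstract highlights blood's non-Newtonian character, the sketch below evaluates the Carreau viscosity model, a common choice for shear-thinning blood in CFD. Assuming this particular model and the quoted parameter values is our illustration; they are not necessarily the blood models used in the study.

```python
import numpy as np

def carreau_viscosity(shear_rate, mu_inf=0.00345, mu_0=0.056, lam=3.313, n=0.3568):
    """Carreau model: mu(g) = mu_inf + (mu_0 - mu_inf) * [1 + (lam*g)^2]^((n-1)/2).

    Default parameters (Pa.s, s) are values often quoted for blood in the
    CFD literature; they are assumptions here, not data from this study.
    """
    shear_rate = np.asarray(shear_rate, dtype=float)
    return mu_inf + (mu_0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

# Apparent viscosity over a range of shear rates typical of a pumping cycle.
gamma = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])   # 1/s
print(carreau_viscosity(gamma))                      # Pa.s, decreasing with shear
```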

Keywords: artificial heart, computational fluid dynamics, heart chamber, design, pump

Procedia PDF Downloads 436
7652 Virtual Assessment of Measurement Error in the Fractional Flow Reserve

Authors: Keltoum Chahour, Mickael Binois

Abstract:

Due to a lack of standardization during the invasive fractional flow reserve (FFR) procedure, the index is subject to many sources of uncertainty. In this paper, we investigate, through simulation, the effect of the FFR device position and configuration on the obtained value of the FFR fraction. For this purpose, we use computational fluid dynamics (CFD) in a 3D domain corresponding to a diseased arterial portion. The FFR pressure sensor is introduced inside it with a given length and coefficient of bending in order to capture the FFR value. To get around the computational limitations (the simulation takes about 2 h 15 min for one FFR value), we generate a Gaussian process (GP) model for FFR prediction. The GP model shows good accuracy and demonstrates the effective error in the measurement created by the random configuration of the pressure sensor.
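
A generic version of such a surrogate, a Gaussian process regressor mapping probe configuration parameters to a simulated FFR value, is sketched below with scikit-learn. The input parameters, kernel and synthetic training data are assumptions standing in for the expensive CFD runs.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

# Synthetic stand-in for a design of CFD experiments:
# inputs = [probe insertion depth (normalised), bending coefficient].
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(30, 2))
# Hypothetical FFR response surface around a clinically plausible value.
y = 0.80 - 0.05 * X[:, 0] + 0.03 * np.sin(4.0 * X[:, 1]) + rng.normal(0, 0.005, 30)

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.3, 0.3]) + WhiteKernel(1e-4)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict FFR (with uncertainty) for new, untried probe configurations.
X_new = np.array([[0.2, 0.5], [0.9, 0.1]])
mean, std = gp.predict(X_new, return_std=True)
print(np.round(mean, 3), np.round(std, 4))
```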

Keywords: fractional flow reserve, Gaussian processes, computational fluid dynamics, drift

Procedia PDF Downloads 98
7651 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them to perform their daily routine tasks in a more convenient and effective manner. This research intends to present an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the utmost issue affecting speech recognition. In order to overcome these issues, different statistical models have been developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are being utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence is the most efficient and reliable of the methods being used in speech recognition.

Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human-machine performance

Procedia PDF Downloads 445
7650 Topology Enhancement of a Straight Fin Using a Porous Media Computational Fluid Dynamics Simulation Approach

Authors: S. Wakim, M. Nemer, B. Zeghondy, B. Ghannam, C. Bouallou

Abstract:

Designing the optimal heat exchanger is still an essential objective to be achieved. Parametric optimization involves the evaluation of the heat exchanger dimensions to find those that best satisfy certain objectives. This method contributes to an enhanced design rather than an optimized one. On the contrary, topology optimization finds the optimal structure that satisfies the design objectives. The huge development in metal additive manufacturing has allowed topology optimization to find its way into engineering applications, especially in the aerospace field, to optimize metal structures. Using topology optimization in 3D heat and mass transfer problems requires huge computational time; therefore, coupling it with CFD simulations can reduce it. However, existing CFD models cannot be coupled with topology optimization. The CFD model must allow the creation of a uniform mesh despite the initial geometry complexity, and it must also allow swapping cells from fluid to solid and vice versa. In this paper, a porous media approach compatible with topology optimization criteria is developed. It consists of modeling the fluid region of the heat exchanger as a porous medium having high porosity and, similarly, modeling the solid region as a porous medium having low porosity. The switching from fluid to solid cells required by topology optimization is simply done by changing each cell's porosity using a user-defined function. This model is tested on a plate-and-fin heat exchanger and validated by comparing its results to experimental data and simulation results. Furthermore, this model is used to perform a material reallocation based on local criteria to optimize a plate-and-fin heat exchanger under a constant heat duty constraint. The optimized fin uses 20% less material than the original, while the pressure drop is reduced by about 13%.
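
For reference, a porous media treatment of this kind typically enters the momentum equations through a Darcy-Forchheimer sink term of the following general form, where alpha is the permeability and C2 the inertial resistance factor; presenting it in this standard form is our addition, and the specific coefficients used by the authors are not reproduced.

```latex
S_i \;=\; -\left(\frac{\mu}{\alpha}\, v_i \;+\; C_2\, \frac{\rho\,\lvert \mathbf{v} \rvert}{2}\, v_i\right),
\qquad
\alpha \to \infty \ \text{(fluid cell)}, \quad \alpha \to 0 \ \text{(solid cell)}.
```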

Keywords: computational methods, finite element method, heat exchanger, porous media, topology optimization

Procedia PDF Downloads 130