Search results for: Bayesian multilevel modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4163

3983 Generational Differences in Leadership and Motivation: A Multilevel Study of Federal Workers

Authors: Sally Selden, Jyoti Aggarwal

Abstract:

The research on generational expectations about leadership is developing, but little scholarship exists on this topic for public sector organizations. Given the size of the federal workforce, this research study fills an important gap in the knowledge base and will inform public organizations how to approach managing and leading a multigenerational workforce. The research objectives of this study are to explore leadership preferences and motivation within generations and to determine whether these qualities differ by type of federal agency (e.g., law enforcement, human services, etc.). This paper will review the research on generational differences, expectations, and leadership with a focus on studies of public organizations. Using hierarchical linear modeling (HLM), this study will examine how leadership and motivation vary by generation in the federal government workforce, controlling for other demographic characteristics. The study will also examine whether generational differences impact satisfaction and performance. The study will utilize the 2019 Federal Employee Viewpoint Survey.
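
For illustration, a minimal sketch of the kind of hierarchical linear model described above, written with statsmodels; the column names and synthetic survey data are placeholders, not drawn from the Federal Employee Viewpoint Survey.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for respondent-level survey data (column names are hypothetical).
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "agency": rng.integers(0, 20, n),                         # level-2 grouping variable
    "generation": rng.choice(["Boomer", "GenX", "Millennial"], n),
    "tenure_years": rng.integers(0, 30, n),
})
agency_effect = rng.normal(0, 0.5, 20)[df["agency"]]
df["leadership_satisfaction"] = (3.5 + 0.2 * (df["generation"] == "Millennial")
                                 + 0.01 * df["tenure_years"]
                                 + agency_effect + rng.normal(0, 1, n))

# Random intercept for agency; generation and a demographic control as fixed effects.
model = smf.mixedlm("leadership_satisfaction ~ C(generation) + tenure_years",
                    data=df, groups=df["agency"])
result = model.fit()
print(result.summary())
```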

Keywords: multigenerational workforce, leadership, generational differences, federal workforce

Procedia PDF Downloads 193
3982 Reinventing Urban Governance: Sustainable Transport Solutions for Mitigating Climate Risks in Smart Cities

Authors: Jaqueline Nichi, Leila Da Costa Ferreira, Fabiana Barbi Seleguim, Gabriela Marques Di Giulio, Mariana Barbieri

Abstract:

The transport sector is responsible for approximately 55% of global greenhouse gas (GHG) emissions, in addition to pollution and other negative externalities, such as road accidents and congestion, that affect the routine of those who live in large cities. The objective of this article is to discuss the application and use of distinct mobility technologies as climate adaptation and mitigation measures in the context of smart cities in the Global South. The documentary analysis is combined with 22 semi-structured interviews with managers who work with mobility technologies in the public and private sectors and in civil society organizations to explore multilevel-governance solutions for smart, low-carbon mobility, based on a case study of the city of São Paulo, Brazil. The hypothesis that innovation and technology to mitigate and adapt to climate impacts are not yet sufficient to make mobility more sustainable has been confirmed. The results indicate four relevant aspects for advancing a climate agenda in smart cities: integrated planning, coproduction of knowledge, experiments in governance, and new means of financing to guarantee the sustainable sociotechnical transition of the sector.

Keywords: urban mobility, climate change, smart cities, multilevel governance

Procedia PDF Downloads 21
3981 The Persistence of Abnormal Return on Assets: An Exploratory Analysis of the Differences between Industries and Differences between Firms by Country and Sector

Authors: José Luis Gallizo, Pilar Gargallo, Ramon Saladrigues, Manuel Salvador

Abstract:

This study offers an exploratory statistical analysis of the persistence of annual profits across a sample of firms from different European Union (EU) countries. To this end, a hierarchical Bayesian dynamic model has been used which enables the annual behaviour of those profits to be broken down into a permanent structural component and a transitory component, while also distinguishing between general effects affecting the industry as a whole to which each firm belongs and specific effects affecting each firm in particular. This breakdown enables the relative importance of those fundamental components to be evaluated more accurately by country and sector. Furthermore, the Bayesian approach allows different hypotheses to be tested about the homogeneity of the behaviour of the above components with respect to the sector and the country where the firm develops its activity. The data analysed come from a sample of 23,293 firms in EU countries selected from the AMADEUS database. The period analysed ran from 1999 to 2007, and 21 sectors were analysed, chosen in such a way that there was a sufficiently large number of firms in each country-sector combination for the industry effects to be estimated accurately enough for meaningful comparisons to be made by sector and country. The analysis has been conducted by sector and by country from a Bayesian perspective, which makes the study more flexible and realistic since the estimates obtained do not depend on asymptotic results. In general terms, the study finds that, although the industry effects are significant, the firm-specific effects are more important, and their importance varies depending on the sector or the country in which the firm carries out its activity. The firm effects account for around 81% of total variation and display a significantly lower degree of persistence, with adjustment speeds oscillating around 34%. However, this pattern is not homogeneous but depends on the sector and country analysed. The industry effects, which also depend on the sector and country analysed, have a more marginal importance and are significantly more persistent, with adjustment speeds oscillating around 7-8%, a degree of persistence that is very similar for most of the sectors and countries analysed.
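
For illustration, a minimal PyMC sketch in the spirit of the hierarchical Bayesian dynamic model described above: firm-level permanent components partially pooled within industries, plus an AR(1)-type transitory component. Priors, dimensions, and data are placeholders, not the paper's specification.

```python
import numpy as np
import pymc as pm

# y[t, i]: abnormal ROA of firm i in year t; industry[i]: industry index of firm i.
rng = np.random.default_rng(0)
T, N, J = 8, 50, 5
industry = rng.integers(0, J, size=N)
y = rng.normal(0.0, 0.05, size=(T, N))      # placeholder panel data

with pm.Model() as model:
    # Industry-level permanent effects.
    mu_ind = pm.Normal("mu_ind", mu=0.0, sigma=0.05, shape=J)
    # Firm-level permanent effects, partially pooled toward their industry mean.
    sigma_firm = pm.HalfNormal("sigma_firm", sigma=0.05)
    mu_firm = pm.Normal("mu_firm", mu=mu_ind[industry], sigma=sigma_firm, shape=N)
    # Persistence (speed of adjustment) of deviations from the permanent level.
    lam = pm.Beta("lam", alpha=2.0, beta=2.0)
    sigma_eps = pm.HalfNormal("sigma_eps", sigma=0.05)
    # AR(1)-type transition around the firm-specific permanent level.
    mean_t = mu_firm + lam * (y[:-1] - mu_firm)
    pm.Normal("obs", mu=mean_t, sigma=sigma_eps, observed=y[1:])
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```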

Keywords: dynamic models, Bayesian inference, MCMC, abnormal returns, persistence of profits, return on assets

Procedia PDF Downloads 376
3980 Application of Bayesian Model Averaging and Geostatistical Output Perturbation to Generate Calibrated Ensemble Weather Forecast

Authors: Muhammad Luthfi, Sutikno Sutikno, Purhadi Purhadi

Abstract:

Weather forecasts must continually be improved in order to provide communities with accurate and objective predictions. To reduce the subjectivity of forecasts, numerical weather prediction has been developed extensively. Yet Numerical Weather Prediction (NWP) outputs are issued without taking dynamical weather behavior and local terrain features into account, so NWP outputs cannot accurately forecast weather quantities, particularly for medium- and long-range forecasts. The aim of this research is to aid and extend the development of ensemble forecasting for the Meteorology, Climatology, and Geophysics Agency of Indonesia. The ensemble method is an approach that combines various deterministic forecasts to produce a more reliable one. However, such a forecast is biased and uncalibrated due to its underdispersive or overdispersive nature. As a parametric method, Bayesian Model Averaging (BMA) generates a calibrated ensemble forecast and constructs a predictive PDF for a specified period. This method can utilize an ensemble of any size but does not take spatial correlation into account, whereas spatial dependencies involve the site of interest and nearby sites and are influenced by dynamic weather behavior. Meanwhile, Geostatistical Output Perturbation (GOP) accounts for the spatial correlation to generate future weather quantities, although it is built from only a single deterministic forecast, and it can also generate an ensemble of any size. This research applies both BMA and GOP to generate calibrated ensemble forecasts of daily temperature at a few meteorological sites near an Indonesian international airport.
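
For illustration, a minimal sketch of a BMA predictive density in the usual weighted-mixture-of-normals form. The member forecasts, bias-correction coefficients, and weights are invented placeholders standing in for values that would come from training.

```python
import numpy as np
from scipy import stats

members = np.array([26.1, 27.3, 25.8])       # raw forecasts from 3 NWP members (deg C)
a = np.array([0.4, -0.2, 0.1])               # bias-correction intercepts (from training)
b = np.array([0.98, 1.01, 0.99])             # bias-correction slopes (from training)
w = np.array([0.5, 0.3, 0.2])                # BMA weights (sum to 1, from EM training)
sigma = 1.2                                  # common predictive spread

corrected = a + b * members                  # bias-corrected member means

def bma_pdf(t):
    """BMA predictive density: weighted mixture of member-centered normals."""
    return np.sum(w * stats.norm.pdf(t, loc=corrected, scale=sigma))

grid = np.linspace(20, 32, 121)
pdf = np.array([bma_pdf(t) for t in grid])
print("BMA predictive mean:", np.sum(w * corrected))
```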

Keywords: Bayesian Model Averaging, ensemble forecast, geostatistical output perturbation, numerical weather prediction, temperature

Procedia PDF Downloads 249
3979 Enhancing Predictive Accuracy in Pharmaceutical Sales through an Ensemble Kernel Gaussian Process Regression Approach

Authors: Shahin Mirshekari, Mohammadreza Moradi, Hossein Jafari, Mehdi Jafari, Mohammad Ensaf

Abstract:

This research employs Gaussian Process Regression (GPR) with an ensemble kernel, integrating Exponential Squared, Revised Matern, and Rational Quadratic kernels to analyze pharmaceutical sales data. Bayesian optimization was used to identify optimal kernel weights: 0.76 for Exponential Squared, 0.21 for Revised Matern, and 0.13 for Rational Quadratic. The ensemble kernel demonstrated superior performance in predictive accuracy, achieving an R² score near 1.0, and significantly lower values in MSE, MAE, and RMSE. These findings highlight the efficacy of ensemble kernels in GPR for predictive analytics in complex pharmaceutical sales datasets.
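
For illustration, a minimal scikit-learn sketch of GPR with a weighted ensemble (sum) kernel. Here "Exponential Squared" is read as the RBF (squared-exponential) kernel and "Revised Matern" is approximated by a standard Matérn kernel; both readings are assumptions, and the sales data are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic, ConstantKernel

# Ensemble kernel: weighted sum of three base kernels (weights as reported above).
kernel = (
    ConstantKernel(0.76) * RBF(length_scale=1.0)
    + ConstantKernel(0.21) * Matern(length_scale=1.0, nu=1.5)
    + ConstantKernel(0.13) * RationalQuadratic(length_scale=1.0, alpha=1.0)
)

# X: week index, y: weekly sales of one product (illustrative synthetic data).
rng = np.random.default_rng(42)
X = np.arange(104, dtype=float).reshape(-1, 1)
y = 100 + 0.5 * X.ravel() + 10 * np.sin(X.ravel() / 8) + rng.normal(0, 2, size=104)

gpr = GaussianProcessRegressor(kernel=kernel, alpha=1e-2, normalize_y=True)
gpr.fit(X[:90], y[:90])
y_pred, y_std = gpr.predict(X[90:], return_std=True)
print("Held-out R^2:", gpr.score(X[90:], y[90:]))
```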

Keywords: Gaussian process regression, ensemble kernels, bayesian optimization, pharmaceutical sales analysis, time series forecasting, data analysis

Procedia PDF Downloads 29
3978 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, traditional treatment methods (flotation, coagulation, microwave heating, etc.) often bring high operating costs, secondary pollution, and other problems. To solve these problems, materials with high flux and stability for separating surfactant-stabilized emulsions have gained considerable attention in the treatment of oily wastewater. Nevertheless, four kinds of stable oil-in-water emulsion can form depending on the surfactant used (surfactant-free, anionic, cationic, and non-ionic), and previously reported advanced materials can separate only one or a few of them and cannot effectively separate all of them in one step. Herein, a facile synthesis method for graphene-based multilevel filter materials (GMFM) is reported that can efficiently separate oil-in-water emulsions stabilized with different surfactants under gravity alone. The prepared materials remain stable over 20 cycles and show a high flux of ~5000 L m⁻² h⁻¹ with a separation efficiency of >99.9%. GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 51
3977 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant

Authors: Michael Smalenberger

Abstract:

Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate parameters necessary for the implementation of the artificial intelligence component, and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and probability of mastery, none directly and reliably ask the user to self-assess these. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage while transaction level data are scant. Once a user’s transaction level data become available after sufficient ITS usage, these can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to lead to mastery of a knowledge component, the associated implications on learning curves, and its relevance to instruction time.
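
For illustration, a minimal sketch of the standard BKT update equations (guess, slip, learn), with parameter values of the kind a student self-assessment might supply; the numbers are placeholders.

```python
def bkt_update(p_mastery, correct, guess=0.2, slip=0.1, learn=0.15):
    """Return P(mastered) after observing one response (True = correct)."""
    if correct:
        # P(L | correct response)
        cond = (p_mastery * (1 - slip)) / (
            p_mastery * (1 - slip) + (1 - p_mastery) * guess
        )
    else:
        # P(L | incorrect response)
        cond = (p_mastery * slip) / (
            p_mastery * slip + (1 - p_mastery) * (1 - guess)
        )
    # Apply the learning transition after the practice opportunity.
    return cond + (1 - cond) * learn

p = 0.3  # initial mastery estimate, e.g. from a student's self-assessment
for outcome in [True, False, True, True]:
    p = bkt_update(p, outcome)
    print(f"P(mastery) = {p:.3f}")
```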

Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation

Procedia PDF Downloads 146
3976 Breast Cancer Detection Using Machine Learning Algorithms

Authors: Jiwan Kumar, Pooja, Sandeep Negi, Anjum Rouf, Amit Kumar, Naveen Lakra

Abstract:

Health issues are increasing day by day, and breast cancer is one of the most serious among them, making detection in the early stages crucial. Doctors can use this model to tell their patients whether a tumor is not harmful (benign) or harmful (malignant). We used machine learning to build the model, applying algorithms such as Logistic Regression, Random Forest, Support Vector Classifier, Bayesian Network, and Radial Basis Function. We used data on the crucial features and presented the results as pictures in order to make interpretation easier for doctors. In doing so, we are making machine learning better at finding breast cancer, which can lead to more lives saved and better health care.
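
For illustration, a minimal scikit-learn sketch comparing a few of the named classifiers on the Wisconsin breast-cancer dataset bundled with scikit-learn, used here as a stand-in for the study's (unspecified) data.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)  # 0 = malignant, 1 = benign
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000)),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "support vector classifier": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, model.predict(X_te)):.3f}")
```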

Keywords: Bayesian network, radial basis function, ensemble learning, understandable, data making better, random forest, logistic regression, breast cancer

Procedia PDF Downloads 16
3975 Methodologies, Systems Development Life Cycle and Modeling Languages in Agile Software Development

Authors: I. D. Arroyo

Abstract:

This article seeks to integrate different concepts from contemporary software engineering with an agile development approach. We seek to clarify some definitions and uses, we distinguish between the Systems Development Life Cycle (SDLC) and the methodologies, and we differentiate types of frameworks such as methodological, philosophical, and behavioral frameworks, standards, and documentation. We define relationships based on the documentation of the development process through formal and ad hoc models, and we describe the usefulness of using DevOps and Agile Modeling as methodologies that integrate principles and best practices.

Keywords: methodologies, modeling languages, agile modeling, UML

Procedia PDF Downloads 153
3974 Electricity Demand Modeling and Forecasting in Singapore

Authors: Xian Li, Qing-Guo Wang, Jiangshuai Huang, Jidong Liu, Ming Yu, Tan Kok Poh

Abstract:

In the power industry, accurate electricity demand forecasting over a given lead time is important for system operation and control. In this paper, we investigate the modeling and forecasting of Singapore's electricity demand. Several standard models, such as the HWT exponential smoothing model, the ARMA model, and the ANN model, have been proposed based on historical demand data. We applied them to the Singapore electricity market and proposed three simulation-based refinements to improve modeling accuracy. Compared with existing models, our refined model produces better forecasting accuracy. The simulations demonstrate that adding the forecasting error into the forecasting equation can greatly improve modeling accuracy.
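
For illustration, a minimal statsmodels sketch of one baseline named above (an ARMA-type model fit to historical demand), with a naive error-feedback correction. The synthetic demand series and the exact refinement scheme are assumptions, since the paper's refinements are not detailed here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# Hourly demand with a daily cycle plus noise (illustrative synthetic series, MW).
t = np.arange(24 * 60)
demand = pd.Series(5500 + 800 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 60, t.size))

res = ARIMA(demand, order=(2, 0, 1)).fit()
forecast = res.forecast(steps=24)

# Naive error-feedback refinement: shift the forecast by the last in-sample residual.
refined = forecast + res.resid.iloc[-1]
print(refined.head())
```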

Keywords: power industry, electricity demand, modeling, forecasting

Procedia PDF Downloads 612
3973 Modeling Curriculum for High School Students to Learn about Electric Circuits

Authors: Meng-Fei Cheng, Wei-Lun Chen, Han-Chang Ma, Chi-Che Tsai

Abstract:

Recent K–12 Taiwan Science Education Curriculum Guidelines emphasize the essential role of modeling curricula in science learning; however, few modeling curricula have been designed and adopted in current science teaching. Therefore, this study aims to develop a modeling curriculum on electric circuits to investigate any learning difficulties students have with modeling curricula and further enhance modeling teaching. This study was conducted with 44 10th-grade students in Central Taiwan. Data collection included a Students' Understanding of Models in Science (SUMS) survey that explored the students' epistemology of scientific models and modeling, and a complex circuit problem to investigate the students' modeling abilities. Data analysis included the following: (1) Paired-sample t-tests were used to examine the improvement of students' modeling abilities and conceptual understanding before and after the curriculum was taught. (2) Paired-sample t-tests were also utilized to determine the students' modeling abilities before and after the modeling activities, and a Pearson correlation was used to understand the relationship between students' modeling abilities during the activities and on the posttest. (3) ANOVA was used during different stages of the modeling curriculum to investigate the differences between the students who developed microscopic models and those who developed macroscopic models after the modeling curriculum was taught. (4) Independent-sample t-tests were employed to determine whether the students who changed their models had significantly different understandings of scientific models than the students who did not change their models. The results revealed the following: (1) After the modeling curriculum was taught, the students had made significant progress in both their understanding of the science concepts and their modeling abilities. In terms of science concepts, this modeling curriculum helped the students overcome the misconception that electric currents reduce after flowing through light bulbs. In terms of modeling abilities, this modeling curriculum helped students employ macroscopic or microscopic models to explain their observed phenomena. (2) Encouraging the students to explain scientific phenomena with different context prompts during the modeling process allowed them to convert their models to microscopic models, but it did not help them continuously employ microscopic models throughout the whole curriculum. The students finally employed microscopic models consistently when they had help visualizing the microscopic models. (3) During the modeling process, the students who revised their own models understood better that models can be changed than the students who did not revise their own models. Also, the students who revised their models to explain different scientific phenomena tended to regard models as explanatory tools. In short, this study explored different strategies to facilitate students' modeling processes as well as their difficulties with the modeling process. The findings can be used to design and teach modeling curricula and help students enhance their modeling abilities.
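
For illustration, a minimal SciPy sketch of the paired-sample t-test and Pearson correlation described above, using hypothetical pre/post modeling-ability scores rather than the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
pre = rng.normal(60, 10, size=44)          # pretest modeling-ability scores (hypothetical)
post = pre + rng.normal(8, 6, size=44)     # posttest scores after the curriculum

t_stat, p_value = stats.ttest_rel(post, pre)      # paired-sample t-test
r, p_corr = stats.pearsonr(pre, post)             # Pearson correlation
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}; r = {r:.2f}")
```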

Keywords: electric circuits, modeling curriculum, science learning, scientific model

Procedia PDF Downloads 430
3972 One-off Separation of Multiple Types of Oil-In-Water Emulsions With Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, traditional treatment methods (flotation, coagulation, microwave heating, etc.) often bring high operating costs, secondary pollution, and other problems. To solve these problems, materials with high flux and stability for separating surfactant-stabilized emulsions have gained considerable attention in the treatment of oily wastewater. Nevertheless, four kinds of stable oil-in-water emulsion can form depending on the surfactant used (surfactant-free, anionic, cationic, and non-ionic), and previously reported advanced materials can separate only one or a few of them and cannot effectively separate all of them in one step. Herein, a facile synthesis method for graphene-based multilevel filter materials (GMFM) is reported that can efficiently separate oil-in-water emulsions stabilized with different surfactants under gravity alone. The prepared materials remain stable over 20 cycles and show a high flux of ~5000 L m⁻² h⁻¹ with a separation efficiency of >99.9%. GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 61
3971 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller

Authors: Alireza Dantism

Abstract:

Presently, describing the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be made easier by studying the three-dimensional structure of the protein. In the absence of an experimental protein structure, comparative modeling often provides a useful three-dimensional model of the protein, based on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (target) mainly based on its alignment with one or more proteins of known structure (templates). Comparative modeling consists of five main steps: (1) identifying similarity between the target sequence and at least one known template structure, (2) aligning the target sequence and the template(s), (3) building a model based on the alignment with the selected template(s), (4) predicting model errors, and (5) optimizing the built model. Many computer programs and web servers automate the comparative modeling process. One of the most important advantages of these servers is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without any programming knowledge; however, some experts prefer to do their modeling manually with programming, because this lets them maximize the accuracy of their models. In this study, a web-based tool called EasyModel has been designed to predict the tertiary structure of proteins, using the PHP and Python programming languages. According to the user's inputs, EasyModel receives the unknown (target) sequence, the template protein file that shares a percentage of similarity with the target sequence, and other settings; it then predicts the tertiary structure of the unknown sequence and presents the results in the form of graphs and constructed protein files.
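
For illustration, a minimal sketch of the kind of MODELLER run a tool like EasyModel wraps, following the style of the standard basic-modeling tutorial script. The alignment file and the template/target codes are placeholders, and MODELLER must be installed separately with its license key.

```python
from modeller import *              # noqa: F401,F403 (tutorial-style import)
from modeller.automodel import *    # noqa: F401,F403

env = environ()
env.io.atom_files_directory = ['.']          # where the template PDB file lives

a = automodel(env,
              alnfile='target_template.ali', # PIR alignment of target and template
              knowns='template_pdb',         # code of the template structure
              sequence='target_seq')         # code of the target sequence
a.starting_model = 1
a.ending_model = 5                           # build five candidate models
a.make()                                     # run comparative modeling
```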

Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller

Procedia PDF Downloads 61
3970 Spatio-Temporal Analysis and Mapping of Malaria in Thailand

Authors: Krisada Lekdee, Sunee Sammatat, Nittaya Boonsit

Abstract:

This paper proposes a GLMM with spatial and temporal effects for malaria data in Thailand. A Bayesian method is used for parameter estimation via Gibbs-sampling MCMC. A conditional autoregressive (CAR) model is assumed to represent the spatial effects, and the temporal correlation is represented through the covariance matrix of the random effects. The quarterly malaria data have been extracted from the Bureau of Epidemiology, Ministry of Public Health of Thailand. The factors considered are rainfall and temperature. The results show that rainfall and temperature are positively related to the malaria morbidity rate. The posterior means of the estimated morbidity rates are used to construct the malaria maps. The top 5 highest morbidity rates (per 100,000 population) are in Trat (Q3, 111.70), Chiang Mai (Q3, 104.70), Narathiwat (Q4, 97.69), Chiang Mai (Q2, 88.51), and Chanthaburi (Q3, 86.82). According to the DIC criterion, the proposed model performs better than the GLMM with spatial effects but without temporal terms.

Keywords: Bayesian method, generalized linear mixed model (GLMM), malaria, spatial effects, temporal correlation

Procedia PDF Downloads 417
3969 The Strengths and Limitations of the Statistical Modeling of Complex Social Phenomenon: Focusing on SEM, Path Analysis, or Multiple Regression Models

Authors: Jihye Jeon

Abstract:

This paper analyzes the conceptual frameworks of three statistical methods: multiple regression, path analysis, and structural equation models. When establishing a research model for the statistical modeling of a complex social phenomenon, it is important to know the strengths and limitations of these three models. This study explored the character, strengths, and limitations of each modeling approach and suggested strategies for accurately explaining or predicting the causal relationships among variables. In particular, common modeling mistakes in studies of depression and mental health were discussed.

Keywords: multiple regression, path analysis, structural equation models, statistical modeling, social and psychological phenomenon

Procedia PDF Downloads 602
3968 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems, and the difficulty of finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration, and time-lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows that the MIKE URBAN results lie within the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE), and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in the predictions can be obtained, whereas MIKE URBAN provides only a point estimate. Based on the results of the analysis, it appears that the developed ABC framework performs well for automatic calibration.
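
For illustration, a minimal rejection-sampling ABC sketch calibrating two of the parameters named above against observed runoff. The toy rainfall-runoff simulator, priors, and tolerance are placeholders, not the paper's time-area model.

```python
import numpy as np

rng = np.random.default_rng(3)
rain = rng.gamma(2.0, 3.0, size=100)                 # observed rainfall depths (mm)

def simulate_runoff(rain, initial_loss, reduction):
    """Toy runoff model: subtract an initial loss, then scale by a reduction factor."""
    return np.maximum(rain - initial_loss, 0.0) * reduction

true_runoff = simulate_runoff(rain, initial_loss=2.0, reduction=0.6)
obs = true_runoff + rng.normal(0, 0.3, size=rain.size)   # "observed" runoff

accepted = []
for _ in range(50_000):
    il = rng.uniform(0.0, 5.0)                        # prior on initial loss
    rf = rng.uniform(0.2, 1.0)                        # prior on reduction factor
    sim = simulate_runoff(rain, il, rf)
    distance = np.sqrt(np.mean((sim - obs) ** 2))     # RMSE as the ABC distance
    if distance < 0.4:                                # acceptance tolerance
        accepted.append((il, rf))

post = np.array(accepted)
print("posterior mean (initial loss, reduction):", post.mean(axis=0))
```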

Keywords: automatic calibration framework, approximate bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform

Procedia PDF Downloads 277
3967 Sea of Light: A Game 'Based Approach for Evidence-Centered Assessment of Collaborative Problem Solving

Authors: Svenja Pieritz, Jakab Pilaszanovich

Abstract:

Collaborative Problem Solving (CPS) is recognized as one of the most important skills of the 21st century, with a potential impact on education, job selection, and collaborative systems design. Therefore, CPS has been adopted in several standardized tests, including the Programme for International Student Assessment (PISA) in 2015. A significant challenge in evaluating CPS is the underlying interplay of cognitive and social skills, which requires a more holistic assessment. However, the majority of existing tests use a questionnaire-based assessment, which oversimplifies this interplay and undermines ecological validity. Two major difficulties were identified: firstly, the creation of a controllable, real-time environment allowing natural behaviors and communication between at least two people; secondly, the development of an appropriate method to collect and synthesize both cognitive and social metrics of collaboration. This paper proposes a more holistic and automated approach to the assessment of CPS. To address these two difficulties, a multiplayer problem-solving game called Sea of Light was developed: an environment allowing students to deploy a variety of measurable collaborative strategies. This controlled environment enables researchers to monitor behavior through the analysis of game actions and chat. The corresponding statistical model is a combined approach of Natural Language Processing (NLP) and Bayesian network analysis. Social exchanges via the in-game chat are analyzed through NLP and fed into the Bayesian network along with other game actions. This Bayesian network synthesizes evidence to track and update different subdimensions of CPS. Major findings focus on the correlations between the evidence collected through in-game actions, the participants' chat features, and the CPS self-evaluation metrics. These results give an indication of which game mechanics can best describe CPS evaluation. Overall, Sea of Light gives test administrators control over different problem-solving scenarios and difficulties while keeping the student engaged. It enables a more complete assessment based on complex, socio-cognitive information on actions and communication. This tool permits further investigation of the effects of group constellations and personality in collaborative problem-solving.

Keywords: bayesian network, collaborative problem solving, game-based assessment, natural language processing

Procedia PDF Downloads 110
3966 Reduced Switch Count Asymmetrical Multilevel Inverter Topology

Authors: Voodi Kalandhar, Veera Reddy, Yuva Tejasree

Abstract:

Researchers have become interested in multilevel inverters (MLIs) because of their potential for medium- and high-power applications. MLIs are becoming more popular as a result of their ability to generate higher voltage levels, low power losses, small size, and low price. These inverters are used in high-voltage and high-power applications because the stress on each switch is low. Although many traditional topologies exist, such as the cascaded H-bridge MLI, the flying capacitor MLI, and the diode-clamped MLI, they all have drawbacks. The flying capacitor MLI needs a complicated control system to balance the voltage across the capacitors, and the diode-clamped MLI requires more diodes as the number of levels increases. Even though the cascaded H-bridge MLI is popular for its modularity and simple control, it requires more isolated DC sources. Therefore, a topology with fewer devices has always been necessary for greater efficiency and reliability. A new single-phase MLI topology has been introduced to minimize the required switch count and increase the number of output levels. This new single-phase MLI topology was developed with 3 DC voltage sources, 8 switches, and 13 output levels. To demonstrate the proposed converter's superiority over the other MLI topologies currently in use, a thorough analysis of the proposed topology will be conducted.

Keywords: DC-AC converter, multi-level inverter (MLI), diodes, H-bridge inverter, switches

Procedia PDF Downloads 53
3965 Application of Low-order Modeling Techniques and Neural-Network Based Models for System Identification

Authors: Venkatesh Pulletikurthi, Karthik B. Ariyur, Luciano Castillo

Abstract:

System identification from turbulence wakes can provide a tactical advantage in preparing for and predicting the trajectory of an opponent's movements. A low-order modeling technique, proper orthogonal decomposition (POD), is used to identify the object from its wake pattern and is compared with a pre-trained image-recognition neural network (NN) that classifies the wake patterns into objects. It is demonstrated that low-order modeling with POD predicts the objects better than the pre-trained NN by ~30%.
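
For illustration, a minimal NumPy sketch of POD of a wake snapshot matrix via the SVD, which is the standard construction for such low-order models; the snapshot data are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_snapshots = 2000, 200
# Columns are velocity-field snapshots sampled in time behind a bluff body.
snapshots = rng.normal(size=(n_points, n_snapshots))

mean_flow = snapshots.mean(axis=1, keepdims=True)
fluctuations = snapshots - mean_flow

# Thin SVD: columns of U are the POD modes, s**2 their energies.
U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)

r = 10                                        # number of retained modes
modes = U[:, :r]
coeffs = modes.T @ fluctuations               # temporal coefficients of each mode

energy = (s[:r] ** 2).sum() / (s ** 2).sum()
print(f"Energy captured by {r} modes: {energy:.1%}")
# A new wake snapshot can then be summarized by its r mode coefficients,
# which serve as a compact feature vector for identifying the object.
```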

Keywords: the bluff body wakes, low-order modeling, neural network, system identification

Procedia PDF Downloads 148
3964 Turkey Disaster Risk Management System Project (TAFRISK)

Authors: Ahmet Parlak, Celalettin Bilgen

Abstract:

In order to create an effective early warning system, identification of the risks and the preparation and execution of risk modeling for risk scenarios, taking into account the shortcomings of the old disaster scenarios, should be used to improve the system. In light of this, the importance of risk modeling in creating an effective early warning system is understood. Within the scope of the TAFRISK project, a trend analysis report on risk modeling was developed, and a demonstration of risk modeling for floods and mass movements was conducted. For risk modeling R&D, studies have been conducted to determine the information to be gathered and its sources, to develop algorithms, and to adapt the current algorithms to Turkey's conditions for determining the risk score in high disaster-risk areas. For each type of disaster, the Disaster Deficit Index (DDI), Local Disaster Index (LDI), Prevalent Vulnerability Index (PVI), and Risk Management Index (RMI) have been developed as disaster indices, taking hazard, sensitivity, fragility, vulnerability, and the physical and economic damage into account at the appropriate scale for the respective type.

Keywords: disaster, hazard, risk modeling, sensor

Procedia PDF Downloads 396
3963 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the most important components of many application areas of video data systems. Recently, it has attracted extensive interest from both practitioners and academics in different fields. While video event detection has been the subject of broad study efforts recently, considerably fewer existing methodologies have considered multi-modal data and efficiency-related issues. During soccer matches, various doubtful situations arise that cannot easily be judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high speed of the events. Bayesian networks provide a structure for dealing with this uncertainty using a basic graphical structure together with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos using object-based features. The proposed work utilizes the t-cherry junction tree, a very recent advancement in probabilistic graphical models, to create a compact representation and a good approximation of an intractable model for users' relationships in a social network. There are several advantages to this approach: firstly, the t-cherry junction tree gives the best approximation within the class of junction trees; secondly, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and robustness of the proposed work, which is evaluated over a comprehensive data set comprising multiple soccer videos captured at different places.

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 294
3962 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion

Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao

Abstract:

Due to the low spatial resolution of MODIS data, the accuracy of extracting small patches in landscapes with a high degree of fragmentation is greatly limited. To this end, this study combines Landsat data, with its higher spatial resolution, and MODIS data, with its higher temporal resolution, for decision-level fusion. Considering the importance of the land-heterogeneity factor in the fusion process, it is incorporated as a weighting factor that linearly weights the Landsat classification result and the MODIS classification result. Three levels were used to complete the data fusion: the pixels of the MODIS data, the pixels of the Landsat data, and an object level connecting the two. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test proved that the classification accuracy was improved, in terms of overall accuracy, compared with the single-data-source classification results. The method was also compared with the two-level combination results and with a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution data decision-fusion applications.
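
For illustration, a minimal NumPy sketch of a heterogeneity-weighted linear fusion of per-pixel class probabilities from two sources; the arrays and the exact weighting rule are assumptions, not the paper's scheme.

```python
import numpy as np

rng = np.random.default_rng(5)
n_pixels, n_classes = 6, 4

p_landsat = rng.dirichlet(np.ones(n_classes), size=n_pixels)  # high-resolution source
p_modis = rng.dirichlet(np.ones(n_classes), size=n_pixels)    # high-frequency source
heterogeneity = rng.uniform(0, 1, size=n_pixels)              # 0 = homogeneous patch

# The more heterogeneous (fragmented) a pixel's neighborhood is, the more weight
# is given to the finer-resolution Landsat result.
w_landsat = 0.5 + 0.5 * heterogeneity
fused = w_landsat[:, None] * p_landsat + (1 - w_landsat)[:, None] * p_modis

labels = fused.argmax(axis=1)
print("fused class labels:", labels)
```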

Keywords: image classification, decision fusion, multi-temporal, remote sensing

Procedia PDF Downloads 97
3961 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. Here, we consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and we then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. The experimental results show that our model is more accurate than the other approximation methods.

Keywords: multinomial dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence

Procedia PDF Downloads 409
3960 Diagnostic Assessment for Mastery Learning of Engineering Students with a Bayesian Network Model

Authors: Zhidong Zhang, Yingchen Yang

Abstract:

In this study, a diagnostic assessment model for mastery learning in engineering was established based on a group of undergraduate students enrolled in an engineering course. A diagnostic assessment model can examine both students' learning processes and their achievement results. One very unique characteristic is that the diagnostic assessment model can recognize the errors and anything blocking students in their learning processes. Feedback is provided to help students learn how to solve their learning problems with alternative strategies and to help the instructor find alternative pedagogical strategies in the instructional design. Dynamics is a core course shared by several engineering programs, and it is very challenging for engineering students; thus, knowledge acquisition and problem-solving skills are crucial for student success. Therefore, developing an effective and valid assessment model for student learning is of great importance. Diagnostic assessment is such a model, providing effective feedback for both students and the instructor in the mastery of engineering learning.

Keywords: diagnostic assessment, mastery learning, engineering, bayesian network model, learning processes

Procedia PDF Downloads 128
3959 Modelling and Simulation of the Freezing Systems and Heat Pumps Using Unisim® Design

Authors: C. Patrascioiu

Abstract:

The paper describes the modeling and simulation of heat pump processes. The main objective of the study is the use of a heat pump in propene–propane distillation processes. The modeling and simulation instrument is the Unisim® Design simulator. The paper is structured in three parts: an overview of gas compression, the modeling and simulation of freezing systems, and the modeling and simulation of heat pumps. For each of these systems, the Unisim® Design simulation diagrams, the input–output system structure, and the numerical results are presented. Future studies will consider modeling and simulation of the propene–propane distillation process with a heat pump.

Keywords: distillation, heat pump, simulation, unisim design

Procedia PDF Downloads 334
3958 A Time-Varying and Non-Stationary Convolution Spectral Mixture Kernel for Gaussian Process

Authors: Kai Chen, Shuguang Cui, Feng Yin

Abstract:

Gaussian processes (GPs) with spectral mixture (SM) kernels demonstrate a flexible, non-parametric Bayesian learning ability for modeling unknown functions. In this work, a novel time-varying and non-stationary convolution spectral mixture (TN-CSM) kernel, with significantly enhanced interpretability obtained by using process convolution, is introduced. A way of decomposing the SM component into an auto-convolution of a base SM component and parameterizing it to be input-dependent is outlined. Performing a convolution between two base SM components then yields a novel non-stationary SM component with a much better generalized expression and interpretation. The TN-CSM remains fully compatible with the stationary SM kernel in terms of kernel form and spectral basis, aspects ignored or confused by previous non-stationary kernels. On synthetic and real-world datasets, experiments show the time-varying characteristics of the hyper-parameters in TN-CSM and compare the learning performance of TN-CSM with popular and representative non-stationary GPs.
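
For illustration, a minimal NumPy sketch of the stationary spectral mixture base kernel (Wilson-Adams form for 1-D inputs) that constructions like TN-CSM build on; the mixture weights, means, and variances are placeholders.

```python
import numpy as np

def sm_kernel(x1, x2, weights, means, variances):
    """Stationary SM kernel: k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi tau mu_q)."""
    tau = x1[:, None] - x2[None, :]                       # pairwise input differences
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2 * np.pi**2 * tau**2 * v) * np.cos(2 * np.pi * tau * mu)
    return k

x = np.linspace(0, 10, 50)
K = sm_kernel(x, x, weights=[1.0, 0.5], means=[0.2, 1.0], variances=[0.05, 0.01])
print(K.shape, "symmetric:", np.allclose(K, K.T))
```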

Keywords: Gaussian process, spectral mixture, non-stationary, convolution

Procedia PDF Downloads 164
3957 Dynamic Modeling of Energy Systems Adapted to Low Energy Buildings in Lebanon

Authors: Nadine Yehya, Chantal Maatouk

Abstract:

Low-energy buildings have been developed to meet global climate commitments by reducing energy consumption. They comprise energy-efficient buildings, zero-energy buildings, positive-energy buildings, and passive-house buildings. The reduced energy demands of low-energy buildings call for advanced building energy modeling that focuses on studying active building systems such as heating, cooling, and ventilation, on improving system performance, and on developing control systems. Modeling and building simulation have expanded to cover different modeling approaches, i.e., detailed physical models, dynamic empirical models, and hybrid approaches, which are adopted by various simulation tools. This paper uses DesignBuilder with the EnergyPlus simulation engine in order to, first, study the impact of efficiency measures on building energy behavior by comparing a low-energy residential model to a conventional one in Beirut, Lebanon; second, choose the appropriate energy systems for the studied case, which is characterized by an important cooling demand; and third, study dynamic modeling of the Variable Refrigerant Flow (VRF) system in EnergyPlus, chosen for its advantages over other systems and its availability in the Lebanese market. Finally, simulating different energy-system models with different modeling approaches is necessary to compare the modeling approaches and to investigate the interaction between the energy systems and the building envelope, which affects the total energy consumption of low-energy buildings.

Keywords: physical model, variable refrigerant flow heat pump, dynamic modeling, EnergyPlus, the modeling approach

Procedia PDF Downloads 185
3956 Clinical Efficacy of Indigenous Software for Automatic Detection of Stages of Retinopathy of Prematurity (ROP)

Authors: Joshi Manisha, Shivaram, Anand Vinekar, Tanya Susan Mathews, Yeshaswini Nagaraj

Abstract:

Retinopathy of prematurity (ROP) is abnormal blood vessel development in the retina of the eye in a premature infant. The principal objective of this work is to provide a technique for detecting the demarcation line and the ridge in a given ROP image, which facilitates early detection of ROP at stage 1 and stage 2. The demarcation line is an indicator of stage 1 ROP, and the ridge is typically the hallmark of stage 2 ROP. Thirty RetCam images of Asian Indian infants obtained during routine ROP screening have been used for the analysis. A graphical user interface has been developed to detect the demarcation line/ridge and to extract the ground truth. This novel algorithm uses multilevel vessel enhancement to enhance tubular structures in the digital ROP images. It has been observed that the orientation of the demarcation line/ridge is normal to the direction of the blood vessels, which is used for the identification of the ridge/demarcation line. Quantitative analysis has been presented based on gold-standard images marked by an expert ophthalmologist. Image-based analysis has been based on the length and the position of the detected ridge. In the image-based evaluation, the average sensitivity and positive predictive value were found to be 92.30% and 85.71%, respectively. In the pixel-based evaluation, the average sensitivity, specificity, positive predictive value, and negative predictive value achieved were 60.38%, 99.66%, 52.77%, and 99.75%, respectively.
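
For illustration, a minimal scikit-image sketch of multiscale tubular-structure enhancement using the Frangi vesselness filter, used here as a stand-in for the paper's (unspecified) multilevel vessel-enhancement step; the input image is synthetic.

```python
import numpy as np
from skimage.filters import frangi

# Synthetic stand-in for a RetCam fundus image: dark background with bright "vessels".
img = np.zeros((200, 200))
rows = np.arange(200)
img[rows, (100 + 40 * np.sin(rows / 30)).astype(int)] = 1.0   # a curved vessel-like ridge
img[rows, 60] = 1.0                                           # a straight vessel-like ridge

# Multiscale (multilevel) enhancement of tubular structures over several vessel widths.
vesselness = frangi(img, sigmas=range(1, 8, 2), black_ridges=False)

# Simple binarization of the enhanced vessels for later orientation analysis.
mask = vesselness > vesselness.mean() + 2 * vesselness.std()
print("enhanced vessel pixels:", int(mask.sum()))
```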

Keywords: ROP, ridge, multilevel vessel enhancement, biomedical

Procedia PDF Downloads 364
3955 A Hybrid Fuzzy Clustering Approach for Fertile and Unfertile Analysis

Authors: Shima Soltanzadeh, Mohammad Hosain Fazel Zarandi, Mojtaba Barzegar Astanjin

Abstract:

Diagnosis of male infertility through laboratory tests is expensive and sometimes intolerable for patients. Filling out a questionnaire and then using a classification method can be the first step in the decision-making process, so that laboratory tests are used only in cases with a high probability of infertility. In this paper, we evaluated the performance of four classification methods, namely naive Bayesian, neural network, logistic regression, and fuzzy c-means clustering used as a classifier, in the diagnosis of male infertility due to environmental factors. Since the data are unbalanced, ROC curves are the most suitable method for comparison. We also selected the most important features using a filtering method and examined the impact of this feature reduction on the performance of each method; generally, most of the methods performed better after applying the filter. We showed that using fuzzy c-means clustering as a classifier gives good performance according to the ROC curves, and its performance is comparable to other classification methods such as logistic regression.
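
For illustration, a minimal NumPy sketch of fuzzy c-means used as a classifier in the way described above: cluster the data, label each cluster by majority vote, and use the membership degree as the classification score; the data are synthetic.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain NumPy fuzzy c-means; returns cluster centers and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(X))            # initial memberships, shape (n, c)
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]    # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centers, u

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (60, 4))])
y = np.array([0] * 100 + [1] * 60)                        # 1 = likely infertile (synthetic label)

centers, u = fuzzy_cmeans(X)
hard = u.argmax(axis=1)
# Identify which fuzzy cluster corresponds to the positive class by majority label.
positive_cluster = 1 if y[hard == 1].mean() > y[hard == 0].mean() else 0
scores = u[:, positive_cluster]                           # membership degree as ROC score
print("mean score for positives:", scores[y == 1].mean())
```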

Keywords: classification, fuzzy c-means, logistic regression, Naive Bayesian, neural network, ROC curve

Procedia PDF Downloads 304
3954 Power Quality Improvement Using Interval Type-2 Fuzzy Logic Controller for Five-Level Shunt Active Power Filter

Authors: Yousfi Abdelkader, Chaker Abdelkader, Bot Youcef

Abstract:

This article proposes a five-level shunt active power filter for power quality improvement using an interval type-2 fuzzy logic controller (IT2 FLC). The reference compensating current is extracted using the P-Q theory. The majority of previously reported works are based on two-level inverters with a conventional proportional-integral (PI) controller, which requires rigorous mathematical modeling of the system. In this paper, an IT2 FLC-controlled five-level active power filter is proposed to overcome the problems associated with the PI controller. The IT2 FLC algorithm is applied to control the DC-side capacitor voltage as well as the harmonic currents of the five-level active power filter. The active power filter with an IT2 FLC is simulated in the MATLAB Simulink environment. The simulated response shows that the proposed shunt active power filter controller produces a sinusoidal supply current with low harmonic distortion and in phase with the source voltage.

Keywords: power quality, shunt active power filter, interval type-2 fuzzy logic controller (T2FL), multilevel inverter

Procedia PDF Downloads 137