Search results for: model data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35317

33637 Evaluation of the Families' Psychological Nature and the Relationship between the Academic Success According to the Students' Opinion

Authors: Sebnem Erismen, Ahmet Guneyli, Azize Ummanel

Abstract:

The purpose of this study is to explore the relationship between students' academic success and families' psychological nature. The study is based upon quantitative research, and a descriptive model is used; a relational descriptive model is used to evaluate the relation between families' psychological nature and the academic success level of the students. A total of 523 secondary school students participated in the study. A Personal Information Form, the Family Structure Evaluation Form (FSEF) and school reports were employed as the primary methods of data gathering. ANOVA and the LSD and Scheffé post-hoc tests were used for analysing the data. Results of the study indicate that there are differences in FSEF scores according to the students' and teachers' gender; however, no differences were seen with respect to class level or teacher seniority. Regarding the academic success of the students, the majority achieved high scores. The academic success level of the students also differs according to the classroom teachers' gender and seniority. In conclusion, a relation was found between families' psychological nature and students' academic success.
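As a brief illustration of the analysis described above, the sketch below runs a one-way ANOVA on FSEF scores grouped by student gender, followed by a post-hoc comparison across teacher seniority bands. The file and column names are hypothetical placeholders, and Tukey's HSD merely stands in for the LSD/Scheffé tests named in the abstract.

```python
# Minimal sketch, assuming a table with columns: fsef_score, student_gender, teacher_seniority.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv("fsef_scores.csv")   # hypothetical data file

# One-way ANOVA: do FSEF scores differ by student gender?
groups = [g["fsef_score"].values for _, g in df.groupby("student_gender")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Post-hoc comparison across teacher seniority bands (Tukey HSD used here as a
# stand-in for the LSD/Scheffé tests reported in the study).
print(pairwise_tukeyhsd(df["fsef_score"], df["teacher_seniority"]))
```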

Keywords: families’ perceived psychological nature, academic success, families’ effect on academic success, education

Procedia PDF Downloads 290
33636 Model Estimation and Error Level for Okike’s Merged Irregular Transposition Cipher

Authors: Okike Benjamin, Garba E. J. D.

Abstract:

The researcher has developed a new encryption technique known as the Merged Irregular Transposition Cipher. In this method of encryption, a message to be encrypted is split into parts and each part is encrypted separately. Before the encrypted message is transmitted to the recipient(s), the positions of the splits in the encrypted message may be swapped to provide more security. This work seeks to develop a model relating the number of splits, S, and the average number of characters per split, L, as the message under consideration is split into 2 through 10 parts. After the model is developed, its error level is determined.
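To make the split-and-swap idea concrete, here is a minimal Python sketch. The abstract does not specify the transposition rules of Okike's cipher, so a plain columnar transposition stands in for the per-part encryption step, and the swapping of split positions is simulated with a seeded shuffle; the loop also reports the average characters per split, L, for S = 2 through 10.

```python
import random

def columnar_transpose(part: str, key: int = 3) -> str:
    """Placeholder per-part cipher: read the characters of the part column-wise."""
    return "".join(part[i::key] for i in range(key))

def split_into(message: str, s: int) -> list:
    """Split the message into exactly s parts of (irregular) near-equal length."""
    q, r = divmod(len(message), s)
    parts, start = [], 0
    for i in range(s):
        end = start + q + (1 if i < r else 0)
        parts.append(message[start:end])
        start = end
    return parts

def merged_irregular_encrypt(message: str, s: int, seed: int = 0) -> str:
    """Encrypt each split separately, then swap the positions of the splits."""
    encrypted = [columnar_transpose(p) for p in split_into(message, s)]
    random.Random(seed).shuffle(encrypted)     # swap split positions before transmission
    return "".join(encrypted)

message = "TRANSPOSITION CIPHERS REARRANGE CHARACTERS"
for s in range(2, 11):                         # split number S = 2, ..., 10
    avg_chars = len(message) / s               # average number of characters per split, L
    print(s, round(avg_chars, 2), merged_irregular_encrypt(message, s))
```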

Keywords: merged irregular transposition, error level, model estimation, message splitting

Procedia PDF Downloads 305
33635 3D Multimedia Model for Educational Design Engineering

Authors: Mohanaad Talal Shakir

Abstract:

This paper proposes an educational design using multimedia technology for the Computer Technology Engineering program at Alma'ref University College, Iraq. The acceptance, cognition, and interactivity of the proposed model are evaluated by students, using statistical relationships to determine the stage of the model. The objectives of the proposed educational design are to develop user-friendly educational software using multimedia technology and to develop a 3D animated model that simulates the assembly and disassembly of a high-speed flow (shock tunnel) facility.

Keywords: CAL, multimedia, shock tunnel, interactivity, engineering education

Procedia PDF Downloads 616
33634 An Experimental Investigation into Fluid Forces on Road Vehicles in Unsteady Flows

Authors: M. Sumida, S. Morita

Abstract:

In this research, the effect of unsteady flows acting on road vehicles was experimentally investigated, using an advanced and recently introduced wind tunnel. The aims of this study were to extract the characteristics of fluid forces acting on road vehicles under unsteady wind conditions and obtain new information on drag forces in a practical on-road test. We applied pulsating wind as a representative example of the atmospheric fluctuations that vehicles encounter on the road. That is, we considered the case where the vehicles are moving at constant speed in the air, with large wind oscillations. The experimental tests were performed on the Ahmed-type test model, which is a simplified vehicle model. This model was chosen because of its simplicity and the data accumulated under steady wind conditions. The experiments were carried out with a time-averaged Reynolds number of Re = 4.16×10⁵, a pulsation period of T = 1.5 s, and an amplitude of η = 0.235. Unsteady fluid forces of drag and lift were obtained utilizing a multi-component load cell. It was observed that the unsteady aerodynamic forces differ significantly from those under steady wind conditions. They exhibit a phase shift and an enhanced response to the wind oscillations. Furthermore, their behavior depends on the slant angle of the rear shape of the model.

Keywords: Ahmed body, automotive aerodynamics, unsteady wind, wind tunnel test

Procedia PDF Downloads 291
33633 Value Chain Based New Business Opportunity

Authors: Seonjae Lee, Sungjoo Lee

Abstract:

Discovering new businesses is necessary to remain competitive in the current business environment. Companies survive rapidly changing industry conditions by adopting new business strategies and reducing technology challenges. Traditionally, two methods are used to identify new businesses. The first is a qualitative analysis of expert opinion, through which opportunities are gathered; in the second, new technologies are discovered through quantitative analysis of patent data. The second method increases time and cost, and patent data are restricted in their use for the purpose of discovering business opportunities. This study presents new business opportunities in a form customized to a company's characteristics (sector, size, etc.) by adopting a value chain perspective, and contributes to the creation of new business opportunities through the proposed model. It utilizes the trademark database of the Korean Intellectual Property Office (KIPO) and the proprietary company information database of Korea Enterprise Data (KED). These data are key to discovering new business opportunities through analysis of the trademarks of competitors and advanced businesses (Module 1) and trading analysis of the competitors found in KED (Module 2).

Keywords: value chain, trademark, trading analysis, new business opportunity

Procedia PDF Downloads 370
33632 Macroeconomic Impact of Economic Growth on Unemployment: A Case of South Africa

Authors: Ashika Govender

Abstract:

This study seeks to determine whether Okun’s Law is valid for the South African economy, using time series data for the period 2004 to 2014. The data were accessed from the South African Reserve Bank and Stats SA. The stationarity of the variables was analysed by applying unit root tests via the Augmented Dickey-Fuller (ADF) test, the Phillips-Perron (PP) test, and the Kwiatkowski–Phillips–Schmidt–Shin (KPSS) test. The study used an ordinary least squares (OLS) model in analysing the dynamic version of Okun’s law. An Error Correction Model (ECM) was used to analyse the short-run impact of GDP growth on unemployment, as well as the speed of adjustment. The results indicate a short-run and long-run relationship between the unemployment rate and the GDP growth rate in the period 2004Q1–2014Q4, suggesting that Okun’s law is valid for the South African economy. With a 1 percent increase in GDP, unemployment can decrease by 0.13 percent, ceteris paribus. The research culminates in important policy recommendations, highlighting the relationship between unemployment and economic growth in the spirit of the National Development Plan.
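A minimal sketch of this estimation strategy, assuming quarterly series for unemployment and GDP growth are available in a hypothetical CSV file, might look as follows; statsmodels is used for the ADF test, the long-run OLS regression and the error-correction model, while the PP and KPSS tests would complement the unit-root step.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

# Hypothetical file with columns: date, unemployment, gdp_growth
df = pd.read_csv("south_africa_quarterly.csv", parse_dates=["date"], index_col="date")

# Unit-root check (ADF) on each series.
for col in ["unemployment", "gdp_growth"]:
    stat, pvalue = adfuller(df[col].dropna())[:2]
    print(col, "ADF p-value:", round(pvalue, 3))

# Static (long-run) Okun relationship estimated by OLS.
long_run = sm.OLS(df["unemployment"], sm.add_constant(df["gdp_growth"])).fit()

# Error-correction model: short-run dynamics plus the lagged long-run residual,
# whose coefficient gives the speed of adjustment.
ecm_data = pd.DataFrame({
    "d_unemp": df["unemployment"].diff(),
    "d_gdp": df["gdp_growth"].diff(),
    "ect_lag": long_run.resid.shift(1),
}).dropna()
ecm = sm.OLS(ecm_data["d_unemp"], sm.add_constant(ecm_data[["d_gdp", "ect_lag"]])).fit()
print(ecm.summary())
```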

Keywords: unemployment, economic growth, Okun's law, South Africa

Procedia PDF Downloads 269
33631 Predicting the Next Offensive Play Types will be Implemented to Maximize the Defense’s Chances of Success in the National Football League

Authors: Chris Schoborg, Morgan C. Wang

Abstract:

In the realm of the National Football League (NFL), substantial dedication of time and effort is invested by both players and coaches in meticulously analyzing the game footage of their opponents. The primary aim is to anticipate the actions of the opposing team. Defensive players and coaches are especially focused on deciphering their adversaries' intentions to effectively counter their strategies. Acquiring insights into the specific play type and its intended direction on the field would confer a significant competitive advantage. This study establishes pre-snap information as the cornerstone for predicting both the play type (e.g., deep pass, short pass, or run) and its spatial trajectory (right, left, or center). The dataset for this research spans the regular NFL season data for all 32 teams from 2013 to 2022. This dataset is acquired using the nflreadr package, which conveniently extracts play-by-play data from NFL games and imports it into the R environment as structured datasets. In this study, we employ a recently developed machine learning algorithm, XGBoost. The final predictive model achieves an impressive lift of 2.61. This signifies that the presented model is 2.61 times more effective than random guessing—a significant improvement. Such a model has the potential to markedly enhance defensive coaches' ability to formulate game plans and adequately prepare their players, thus mitigating the opposing offense's yardage and point gains.
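The modelling step could be sketched along the following lines with the xgboost package, where lift is computed as top-class accuracy relative to random guessing. The feature names and the play-bucket column are illustrative assumptions rather than the authors' exact pre-snap feature set.

```python
import pandas as pd
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

plays = pd.read_csv("nfl_play_by_play_2013_2022.csv")   # e.g. exported via nflreadr
features = ["down", "ydstogo", "yardline_100", "qtr", "score_differential"]
target = plays["play_bucket"]                            # e.g. deep pass / short pass / run

X_train, X_test, y_train, y_test = train_test_split(
    plays[features], target.astype("category").cat.codes, test_size=0.2, random_state=42)

model = xgb.XGBClassifier(objective="multi:softprob", n_estimators=300,
                          max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)

accuracy = accuracy_score(y_test, model.predict(X_test))
baseline = 1.0 / target.nunique()        # random-guess accuracy over the play classes
print("lift over random guessing:", round(accuracy / baseline, 2))
```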

Keywords: lift, NFL, sports analytics, XGBoost

Procedia PDF Downloads 53
33630 Designing an Agent-Based Model of SMEs to Assess Flood Response Strategies and Resilience

Authors: C. Li, G. Coates, N. Johnson, M. Mc Guinness

Abstract:

In the UK, flooding is responsible for significant losses to the economy due to its impact on businesses, the vast majority of which are Small and Medium Enterprises (SMEs). Businesses of this nature tend to lack formal plans to aid their response to and recovery from disruptive events such as flooding. This paper reports on how an agent-based model (ABM) is being developed based on interview data gathered from SMEs at risk of flooding and/or with direct experience of flooding. The ABM will enable simulations to be performed, allowing investigation of the different response strategies which SMEs may employ to lessen the impact of flooding, thus strengthening their resilience.
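A minimal, self-contained sketch of such an ABM is shown below: SME agents with or without a continuity plan are disrupted by a flood event, and the model tracks how long they stay out of operation. All parameters (recovery probabilities, share of planned SMEs, flood timing) are illustrative assumptions, not values elicited from the interviews.

```python
import random

class SME:
    def __init__(self, has_plan: bool):
        self.has_plan = has_plan
        self.operational = True
        self.days_down = 0

    def step(self, flooded: bool):
        if flooded and self.operational:
            self.operational = False                 # flood disrupts the business
        if not self.operational:
            recovery_chance = 0.25 if self.has_plan else 0.10
            if random.random() < recovery_chance:    # planned SMEs recover faster
                self.operational = True
            else:
                self.days_down += 1

def avg_days_down(group):
    return sum(a.days_down for a in group) / len(group)

random.seed(1)
agents = [SME(has_plan=(i % 3 == 0)) for i in range(300)]   # ~1/3 have a response plan
for day in range(60):
    flood_today = (day == 5)                          # single flood event on day 5
    for sme in agents:
        sme.step(flood_today)

print("avg days down (with plan):", avg_days_down([a for a in agents if a.has_plan]))
print("avg days down (no plan):  ", avg_days_down([a for a in agents if not a.has_plan]))
```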

Keywords: ABM, flood response, SMEs, business continuity

Procedia PDF Downloads 308
33629 Estimation of Service Quality and Its Impact on Market Share Using Business Analytics

Authors: Haritha Saranga

Abstract:

Service quality has become an important driver of competition in manufacturing industries of late, as many products are being sold in conjunction with service offerings. With increase in computational power and data capture capabilities, it has become possible to analyze and estimate various aspects of service quality at the granular level and determine their impact on business performance. In the current study context, dealer level, model-wise warranty data from one of the top two-wheeler manufacturers in India is used to estimate service quality of individual dealers and its impact on warranty related costs and sales performance. We collected primary data on warranty costs, number of complaints, monthly sales, type of quality upgrades, etc. from the two-wheeler automaker. In addition, we gathered secondary data on various regions in India, such as petrol and diesel prices, geographic and climatic conditions of various regions where the dealers are located, to control for customer usage patterns. We analyze this primary and secondary data with the help of a variety of analytics tools such as Auto-Regressive Integrated Moving Average (ARIMA), Seasonal ARIMA and ARIMAX. Study results, after controlling for a variety of factors, such as size, age, region of the dealership, and customer usage pattern, show that service quality does influence sales of the products in a significant manner. A more nuanced analysis reveals the dynamics between product quality and service quality, and how their interaction affects sales performance in the Indian two-wheeler industry context. We also provide various managerial insights using descriptive analytics and build a model that can provide sales projections using a variety of forecasting techniques.
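The time-series step could be sketched as follows with statsmodels, fitting a seasonal ARIMA with an exogenous regressor (ARIMAX-style) where a hypothetical dealer-level warranty-complaint series proxies service quality; the file and column names are placeholders, not the study's data.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

data = pd.read_csv("dealer_monthly.csv", parse_dates=["month"], index_col="month")
y = data["monthly_sales"]
exog = data[["warranty_complaints"]]           # proxy for dealer service quality

model = SARIMAX(y, exog=exog,
                order=(1, 1, 1),               # non-seasonal ARIMA terms
                seasonal_order=(1, 0, 1, 12))  # yearly seasonality for monthly data
result = model.fit(disp=False)

# 6-month-ahead sales projection, conditional on assumed future complaint levels.
future_exog = exog.tail(6).reset_index(drop=True)
print(result.forecast(steps=6, exog=future_exog))
```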

Keywords: service quality, product quality, automobile industry, business analytics, auto-regressive integrated moving average

Procedia PDF Downloads 117
33628 A Chinese Nested Named Entity Recognition Model Based on Lexical Features

Authors: Shuo Liu, Dan Liu

Abstract:

In the field of named entity recognition, most research has been conducted on simple entities. Nested named entities, which contain entities within entities, have been difficult to identify accurately due to their boundary ambiguity. In this paper, a hierarchical recognition model is constructed based on the grammatical structure and semantic features of Chinese text, with boundary calculation based on lexical features. The analysis is carried out at different levels in terms of granularity, semantics, and lexicality, avoiding repetitive work to reduce computational effort and using the semantic features of words to calculate entity boundaries, thereby improving recognition accuracy. The results of experiments carried out on web-based microblogging data show that the model achieves an accuracy of 86.33% and an F1 value of 89.27% in recognizing nested named entities, making up for the shortcomings of some previous recognition models and improving the efficiency of nested named entity recognition.

Keywords: coarse-grained, nested named entity, Chinese natural language processing, word embedding, T-SNE dimensionality reduction algorithm

Procedia PDF Downloads 124
33627 Diagnostic Assessment for Mastery Learning of Engineering Students with a Bayesian Network Model

Authors: Zhidong Zhang, Yingchen Yang

Abstract:

In this study, a diagnostic assessment model for mastery engineering learning was established based on a group of undergraduate students enrolled in an engineering course. A diagnostic assessment model can examine both students' learning processes and report achievement results. One unique characteristic is that the diagnostic assessment model can recognize errors and anything blocking students in their learning processes. Feedback is provided to help students learn how to solve their learning problems with alternative strategies and to help the instructor find alternative pedagogical strategies for the instructional designs. Dynamics is a core course shared by several engineering programs, and its problems are very challenging for engineering students to solve. Thus, knowledge acquisition and problem-solving skills are crucial for student success, and developing an effective and valid assessment model for student learning is of great importance. Diagnostic assessment is such a model, providing effective feedback to both students and the instructor in the mastery of engineering learning.
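A toy example of the kind of Bayesian network used for diagnostic assessment is sketched below with the pgmpy library: a latent mastery node explains performance on two Dynamics problem types, and observed answers are used to infer the mastery state. The structure, node names and probabilities are illustrative assumptions only.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Mastery", "FreeBodyDiagram"), ("Mastery", "EquationsOfMotion")])

cpd_mastery = TabularCPD("Mastery", 2, [[0.6], [0.4]])               # P(not mastered), P(mastered)
cpd_fbd = TabularCPD("FreeBodyDiagram", 2, [[0.8, 0.2], [0.2, 0.8]], # answer depends on mastery
                     evidence=["Mastery"], evidence_card=[2])
cpd_eom = TabularCPD("EquationsOfMotion", 2, [[0.9, 0.3], [0.1, 0.7]],
                     evidence=["Mastery"], evidence_card=[2])
model.add_cpds(cpd_mastery, cpd_fbd, cpd_eom)

# Diagnostic inference: given observed answers, infer the mastery state and
# hence what may be blocking the student.
infer = VariableElimination(model)
posterior = infer.query(["Mastery"], evidence={"FreeBodyDiagram": 1, "EquationsOfMotion": 0})
print(posterior)
```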

Keywords: diagnostic assessment, mastery learning, engineering, Bayesian network model, learning processes

Procedia PDF Downloads 150
33626 Knowledge Sharing Model Based on Individual and Organizational Factors Related to Faculty Members of University

Authors: Mitra Sadoughi

Abstract:

This study presents a knowledge-sharing model based on individual and organizational factors related to faculty members. To achieve this goal, individual and organizational factors were identified through qualitative research in the form of open, axial, and selective coding; the final model was then obtained using a structural equation model. Participants included 1,719 faculty members of the Azad Universities, Mazandaran Province, Region 3. The sample for the qualitative phase included 25 faculty members experienced in teaching, and the sample for the quantitative phase included 326 faculty members selected by multistage cluster sampling. A 72-item questionnaire was used to measure the quantitative variables; its reliability was 0.93, and its content and face validity were established with the help of faculty members, consultants, and other experts. SPSS and LISREL were used for the analysis of the quantitative data with the structural model and regression. The results showed that the status of knowledge sharing is moderate in the universities. Individual factors influencing knowledge sharing included the sharing of educational materials, perception, confidence, and knowledge self-efficiency, while organizational factors included structural social capital, cognitive social capital, relational social capital, organizational communication, organizational structure, organizational culture, IT infrastructure, and reward systems. Finally, it was found that the contribution of individual factors to knowledge sharing was greater than that of organizational factors; therefore, a model was presented in which the contributions of individual and organizational factors were determined.

Keywords: knowledge sharing, social capital, organizational communication, knowledge self-efficiency, perception, trust, organizational culture

Procedia PDF Downloads 389
33625 Semantic Data Schema Recognition

Authors: Aïcha Ben Salem, Faouzi Boufares, Sebastiao Correia

Abstract:

The subject covered in this paper aims at assisting the user in his or her data quality approach. The goal is to better extract, mix, interpret and reuse data. It deals with the semantic schema recognition of a data source, which enables the extraction of data semantics from all the available information, including the data and the metadata. Firstly, it consists of categorizing the data by assigning each column to a category and possibly a sub-category; secondly, of establishing relations between columns and possibly discovering the semantics of the manipulated data source. The links detected between columns offer a better understanding of the source and of the alternatives for correcting data. This approach allows automatic detection of a large number of syntactic and semantic anomalies.
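A toy sketch of the first step, categorizing columns from their values, is given below. The categories and regular expressions are illustrative assumptions, not the authors' actual taxonomy or detection rules.

```python
import re
import pandas as pd

PATTERNS = {
    "email":        re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "phone":        re.compile(r"^\+?[\d\s().-]{7,}$"),
    "date":         re.compile(r"^\d{4}-\d{2}-\d{2}$"),
    "numeric code": re.compile(r"^\d+$"),
}

def categorise_column(values: pd.Series) -> str:
    """Return the category whose pattern matches the largest share of values."""
    sample = values.dropna().astype(str).head(100)
    scores = {name: sample.str.match(pat).mean() for name, pat in PATTERNS.items()}
    best, share = max(scores.items(), key=lambda kv: kv[1])
    return best if share > 0.8 else "unknown"

df = pd.DataFrame({
    "contact": ["ada@example.org", "bob@example.org"],
    "joined":  ["2015-03-01", "2017-11-23"],
})
print({col: categorise_column(df[col]) for col in df.columns})
```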

Keywords: schema recognition, semantic data profiling, meta-categorisation, semantic dependencies inter columns

Procedia PDF Downloads 414
33624 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints in extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view, to ease interpretation. An alternate but computationally heavy method to make use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched-filtering, motion compensation, etc.), the data are then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time history data for the reflectivity values for each pulse summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D-point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
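A simplified, CPU-only sketch of the backprojection sum is given below: for each 3D reference voxel, the range-compressed return of every pulse is sampled at the voxel's range, phase-corrected and accumulated. Array shapes and the carrier frequency are illustrative assumptions, and a GPU version would map the per-voxel work onto one thread per voxel.

```python
import numpy as np

C = 3e8                      # propagation speed [m/s]
FC = 10e9                    # assumed carrier frequency [Hz]

def backproject(range_profiles, antenna_pos, range_axis, voxels):
    """range_profiles: (n_pulses, n_bins) complex range-compressed data
       antenna_pos:    (n_pulses, 3) platform position per pulse
       range_axis:     (n_bins,) range of each bin [m]
       voxels:         (n_voxels, 3) 3-D reference points"""
    image = np.zeros(len(voxels), dtype=complex)
    for pulse, pos in zip(range_profiles, antenna_pos):
        r = np.linalg.norm(voxels - pos, axis=1)      # voxel-to-antenna range per pulse
        samples = np.interp(r, range_axis, pulse.real) + 1j * np.interp(r, range_axis, pulse.imag)
        image += samples * np.exp(1j * 4 * np.pi * FC / C * r)   # matched phase correction
    return np.abs(image)                               # per-voxel reflectivity magnitude

# Tiny synthetic example: 4 pulses, 200 range bins, 3 voxels near the scene origin.
rng = np.random.default_rng(0)
profiles = rng.standard_normal((4, 200)) + 1j * rng.standard_normal((4, 200))
positions = np.array([[0, -50 + 25 * k, 500] for k in range(4)], dtype=float)
print(backproject(profiles, positions, np.linspace(400, 700, 200), rng.uniform(-10, 10, (3, 3))))
```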

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 75
33623 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine which then generates second-layer hidden variables. Finally, in the third layer hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing as the holdout log likelihood of the correlated topic model – which is better than Dirichlet allocation - is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables characterized by the product categories to which they are related differ strongly between these three models. To derive managerial implications we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research by appropriate extensions. To include predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
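A minimal sketch of the restricted Boltzmann machine step with scikit-learn is shown below, using a synthetic binary basket matrix of the same shape as the study's data and scoring holdout baskets by pseudo-likelihood (a proxy for the exact holdout log likelihood reported); the second RBM illustrates the greedy layer-wise stacking behind a deep belief net.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(42)
baskets = (rng.random((9835, 169)) < 0.05).astype(float)   # synthetic 0/1 purchase matrix
train, holdout = baskets[:4917], baskets[4917:]            # random half-split as in the study

rbm = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=30, random_state=0)
rbm.fit(train)

# score_samples returns a pseudo-likelihood proxy for each holdout basket.
print("mean holdout pseudo-log-likelihood:", rbm.score_samples(holdout).mean())

# Stacking: hidden activations of the first RBM feed a second RBM, the basic
# greedy layer-wise construction behind a deep belief net.
hidden = rbm.transform(train)
rbm2 = BernoulliRBM(n_components=10, n_iter=30, random_state=0).fit(hidden)
```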

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 194
33622 Modelling Residential Space Heating Energy for Romania

Authors: Ion Smeureanu, Adriana Reveiu, Marian Dardala, Titus Felix Furtuna, Roman Kanala

Abstract:

This paper proposes a linear model for optimizing domestic energy consumption in Romania. Both techno-economic and consumer behavior approaches have been considered in developing the model. The proposed model aims to reduce household energy consumption by assembling, in a unitary model, aspects concerning residential lighting, space heating, hot water, combined space heating and hot water, space cooling, and passenger transport. This paper focuses on the space heating component of the domestic energy consumption model and quantifies not only technical-economic issues but also the impact of consumer behavior, related to people's decisions to envelope and insulate buildings in order to minimize energy consumption.
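A toy linear program in the spirit of the space-heating component is sketched below: an annual heat demand is met at minimum cost from two heat sources, with insulation treated as a third option that offsets part of the demand. All coefficients are illustrative assumptions, not values from the Romanian model.

```python
from scipy.optimize import linprog

# Variables: x0 = heat from gas boiler [kWh], x1 = heat from heat pump [kWh],
#            x2 = demand reduction purchased via insulation [kWh]
cost = [0.09, 0.06, 0.04]          # illustrative unit cost of each option [EUR/kWh]

demand = 8000.0                    # annual useful heat demand [kWh]
A_ub = [[-1.0, -1.0, -1.0],        # -(x0 + x1 + x2) <= -demand (supply + savings cover demand)
        [0.0, 1.0, 0.0],           # heat-pump output limited by installed capacity
        [0.0, 0.0, 1.0]]           # insulation can offset at most 30% of demand
b_ub = [-demand, 5000.0, 0.3 * demand]

result = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print("minimum annual cost [EUR]:", round(result.fun, 2))
print("gas / heat pump / insulation [kWh]:", result.x.round(1))
```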

Keywords: consumer behavior, open source energy modeling system (OSeMOSYS), MARKAL/TIMES Romanian energy model, virtual technologies

Procedia PDF Downloads 537
33621 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations

Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso

Abstract:

Loss of containment is the primary hazard that process safety management is concerned with in the oil and gas industry. Escalation to more serious consequences all begins with the loss of containment, starting with oil and gas release from leakage or spillage from primary containment, resulting in pool fire, jet fire and even explosion when reacted with various ignition sources in the operations. Therefore, the heart of process safety management is avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard for the case is an early detection system to alert Operations to take action prior to a potential case of loss of containment. The detection system's value increases when applied to a long surface pipeline that is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, detecting loss of containment accurately in a surface pipeline is difficult. The trade-off between cost-effectiveness and high accuracy has been the main issue when selecting the traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires analysis of closely positioned pressure, flow and temperature (PVT) points in the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive, hence generally not a viable alternative from an economic standpoint. A conceptual approach to combining mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results in predicting leakage in the pipeline. Mathematical modeling is used to generate simulation data, which are then used to train the leak detection and localization models. Mathematical models and simulation software have also been shown to provide results comparable with experimental data, with very high levels of accuracy. While the supervised machine learning model requires a large training dataset for the development of accurate models, mathematical modeling has been shown to be able to generate the required datasets to justify the application of data analytics for the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for the use of data analytics tools and mathematical modeling for the development of robust real-time leak detection and localization systems for surface pipelines. A case study is also presented.
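The proposed combination could be sketched conceptually as follows: a stand-in generator plays the role of the CFD simulations that produce labelled pressure/flow/temperature signatures, and a supervised classifier learns to flag leakage. The feature names and distributions are illustrative assumptions, not outputs of a real CFD code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)

def simulate_case(leak: bool):
    """Return [upstream P, downstream P, flow in, flow out, temperature] for one case."""
    p_up = rng.normal(60.0, 1.0)                      # bar
    p_down = p_up - rng.normal(5.0, 0.5) - (3.0 if leak else 0.0)
    q_in = rng.normal(900.0, 20.0)                    # m3/h
    q_out = q_in - (rng.normal(40.0, 5.0) if leak else rng.normal(0.0, 5.0))
    temp = rng.normal(35.0, 2.0)                      # degC
    return [p_up, p_down, q_in, q_out, temp]

X = np.array([simulate_case(leak) for leak in ([True] * 2000 + [False] * 2000)])
y = np.array([1] * 2000 + [0] * 2000)                 # 1 = leak, 0 = normal operation

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```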

Keywords: pipeline, leakage, detection, AI

Procedia PDF Downloads 185
33620 Optimal Portfolio of Multi-service Provision based on Stochastic Model Predictive Control

Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch

Abstract:

With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency response (FR). However, an MG's operation requires balancing uncertain renewable generation and loads in real time, and the contracted services must be provided continuously during the agreed time window; otherwise, the MG is penalized for under-delivered provision. To hedge against risks due to uncertainties and maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize MG operation for multi-service provision. Distinguished from previous works, we include a detailed economic-degradation model of the lithium-ion battery to quantify the costs of different service provisions and to accurately describe the changing dynamics of the battery. Considering a set of load and generation scenarios and battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR). It aims to achieve the maximum expected net revenue while avoiding severe losses. The framework will be demonstrated on a case study of a PV-battery grid-tied microgrid in the UK with real-life data. To highlight its performance, the framework will be compared with the case without the degradation model and with the deterministic formulation.
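The risk-averse objective can be sketched compactly with cvxpy, combining scenario costs with a CVaR term in the standard Rockafellar–Uryasev form. The single charging decision, scenario prices, penalties and the assumed service revenue are placeholders; the full model would add battery, network and degradation constraints.

```python
import numpy as np
import cvxpy as cp

np.random.seed(0)
n_scenarios = 50
alpha = 0.95                                   # CVaR confidence level
risk_weight = 0.5                              # trade-off between expectation and CVaR

charge = cp.Variable(nonneg=True)              # simplified single decision [kWh]
price = np.random.uniform(0.05, 0.25, n_scenarios)      # scenario energy prices
penalty = np.random.uniform(0.0, 0.10, n_scenarios)     # scenario under-delivery penalties
coeff = price + penalty - 0.30                 # net cost per kWh (0.30 = assumed service revenue)
scenario_cost = cp.hstack([float(coeff[i]) * charge for i in range(n_scenarios)])

# CVaR auxiliary variables: t plays the role of VaR, s holds the scenario excess losses.
t = cp.Variable()
s = cp.Variable(n_scenarios, nonneg=True)
cvar = t + cp.sum(s) / ((1 - alpha) * n_scenarios)

objective = cp.Minimize((1 - risk_weight) * cp.sum(scenario_cost) / n_scenarios
                        + risk_weight * cvar)
constraints = [s >= scenario_cost - t, charge <= 100]
problem = cp.Problem(objective, constraints)
problem.solve()
print("optimal charge [kWh]:", round(charge.value, 2), "objective:", round(problem.value, 4))
```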

Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids

Procedia PDF Downloads 118
33619 Iraqi Short Term Electrical Load Forecasting Based on Interval Type-2 Fuzzy Logic

Authors: Firas M. Tuaimah, Huda M. Abdul Abbas

Abstract:

Accurate Short Term Load Forecasting (STLF) is essential for a variety of decision-making processes. However, forecasting accuracy can drop due to the presence of uncertainty in the operation of energy systems or unexpected behavior of exogenous variables. The Interval Type-2 Fuzzy Logic System (IT2 FLS), with its additional degrees of freedom, provides an excellent tool for handling uncertainties and improves prediction accuracy. The training data used in this study cover the period from January 1, 2012 to February 1, 2012 for the winter season and from July 1, 2012 to August 1, 2012 for the summer season. The actual load forecasting period runs from January 22 to 28, 2012 for the winter model and from July 22 to 28, 2012 for the summer model. The real data are for the Iraqi power system and belong to the Ministry of Electricity.

Keywords: short term load forecasting, prediction interval, type 2 fuzzy logic systems, electric, computer systems engineering

Procedia PDF Downloads 392
33618 Ecosystem Model for Environmental Applications

Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru

Abstract:

This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, to determine appropriate methods of action for reducing adverse effects on the environment and, implicitly, on the population. The proposed model provides a new perspective for environmental assessment, and it can be used as a practical instrument for decision-making.

Keywords: ecosystem model, environmental security, fuzzy logic, sustainability of habitable regions

Procedia PDF Downloads 418
33617 Analyzing the Effects of Supply and Demand Shocks in the Spanish Economy

Authors: José M Martín-Moreno, Rafaela Pérez, Jesús Ruiz

Abstract:

In this paper we use a small open economy Dynamic Stochastic General Equilibrium Model (DSGE) for the Spanish economy to search for a deeper characterization of the determinants of Spain’s macroeconomic fluctuations throughout the period 1970-2008. In order to do this, we distinguish between tradable and non-tradable goods to take into account the fact that the presence of non-tradable goods in this economy is one of the largest in the world. We estimate a DSGE model with supply and demand shocks (sectorial productivity, public spending, international real interest rate and preferences) using Kalman Filter techniques. We find the following results. First of all, our variance decomposition analysis suggests that 1) the preference shock basically accounts for private consumption volatility, 2) the idiosyncratic productivity shock accounts for non-tradable output volatility, and 3) the sectorial productivity shock along with the international interest rate both greatly account for tradable output. Secondly, the model closely replicates the time path observed in the data for the Spanish economy and finally, the model captures the main cyclical qualitative features of this economy reasonably well.

Keywords: business cycle, DSGE models, Kalman filter estimation, small open economy

Procedia PDF Downloads 414
33616 Understanding Cruise Passengers’ On-board Experience throughout the Customer Decision Journey

Authors: Sabina Akter, Osiris Valdez Banda, Pentti Kujala, Jani Romanoff

Abstract:

This paper examines the relationship between on-board environmental factors and overall customer satisfaction in the context of the cruise on-board experience. The on-board environmental factors considered are ambient, layout/design, social, product/service and on-board enjoyment factors. The study presents a data-driven framework and model for the on-board cruise experience. The data were collected from 893 respondents through a self-administered online questionnaire about their cruise experience. This study reveals the cruise passengers’ on-board experience through the customer decision journey, based on the publicly available data. Pearson correlation and regression analysis have been applied, and the results show a positive and significant relationship between the environmental factors and the on-board experience. These data help in understanding the cruise passengers’ on-board experience, which will be used for the ultimate decision-making process in cruise ship design.

Keywords: cruise behavior, customer activities, on-board environmental factors, on-board experience, user or customer satisfaction

Procedia PDF Downloads 165
33615 Dynamic Process Monitoring of an Ammonia Synthesis Fixed-Bed Reactor

Authors: Bothinah Altaf, Gary Montague, Elaine B. Martin

Abstract:

This study involves the modeling and monitoring of an ammonia synthesis fixed-bed reactor using partial least squares (PLS) and its variants. The process exhibits complex dynamic behavior due to the presence of heat recycling and feed quench. One limitation of a static PLS model in this situation is that it does not take account of the process dynamics; hence, dynamic PLS was used. Although dynamic PLS showed superior performance to static PLS in terms of prediction, the monitoring scheme was inappropriate, and adaptive PLS was therefore considered. A limitation of adaptive PLS is that non-conforming observations also contribute to the model; therefore, a new adaptive approach, robust adaptive dynamic PLS, was developed. This approach updates a dynamic PLS model and is robust to non-representative data. The developed methodology showed a clear improvement over existing approaches in terms of both the modeling of the reactor and the detection of faults.
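The dynamic PLS idea can be sketched as follows: the predictor matrix is augmented with time-lagged copies of the process variables before fitting PLS, and a simple squared-prediction-error statistic is used for monitoring. The synthetic data, variable count and number of lags are illustrative assumptions, not the reactor data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def add_lags(X, n_lags=2):
    """Stack X with its lagged copies: [X(t), X(t-1), ..., X(t-n_lags)]."""
    blocks = [X[n_lags - k:len(X) - k] for k in range(n_lags + 1)]
    return np.hstack(blocks)

rng = np.random.default_rng(3)
X = rng.standard_normal((500, 6))                 # e.g. temperatures, flows, quench rate
y = (X[:, 0] + 0.5 * np.roll(X[:, 1], 1)).reshape(-1, 1)   # output with a lagged dependence

n_lags = 2
X_dyn, y_dyn = add_lags(X, n_lags), y[n_lags:]

pls = PLSRegression(n_components=4).fit(X_dyn, y_dyn)
residuals = y_dyn - pls.predict(X_dyn)
print("RMSE:", float(np.sqrt(np.mean(residuals ** 2))))

# A simple monitoring statistic: squared prediction error (SPE) per sample,
# flagged against a rough 99th-percentile control limit from the training data.
spe = (residuals ** 2).ravel()
limit = np.percentile(spe, 99)
print("samples exceeding SPE limit:", int(np.sum(spe > limit)))
```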

Keywords: ammonia synthesis fixed-bed reactor, dynamic partial least squares modeling, recursive partial least squares, robust modeling

Procedia PDF Downloads 386
33614 Mathematical and Numerical Analysis of a Nonlinear Cross Diffusion System

Authors: Hassan Al Salman

Abstract:

We consider a nonlinear parabolic cross diffusion model arising in applied mathematics. A fully practical piecewise linear finite element approximation of the model is studied. By using entropy-type inequalities and compactness arguments, existence of a global weak solution is proved. Providing further regularity of the solution of the model, some uniqueness results and error estimates are established. Finally, some numerical experiments are performed.

Keywords: cross diffusion model, entropy-type inequality, finite element approximation, numerical analysis

Procedia PDF Downloads 380
33613 Using Open Source Data and GIS Techniques to Overcome Data Deficiency and Accuracy Issues in the Construction and Validation of Transportation Network: Case of Kinshasa City

Authors: Christian Kapuku, Seung-Young Kho

Abstract:

An accurate representation of the transportation system serving the region is one of the important aspects of transportation modeling. Such a representation often requires developing an abstract model of the system elements, which in turn requires a substantial amount of data, surveys and time. However, in some cases, such as in developing countries, data deficiencies and time and budget constraints do not always allow such an accurate representation, leaving room for assumptions that may negatively affect the quality of the analysis. With the emergence of open source Internet data, especially in mapping technologies, as well as advances in Geographic Information Systems, opportunities to tackle these issues have arisen. Therefore, the objective of this paper is to demonstrate such an application through the practical case of developing the transportation network for the city of Kinshasa. GIS geo-referencing was used to construct the digitized map of Transportation Analysis Zones using available scanned images. Centroids were then dynamically placed at the center of activities using an activity density map. Next, the road network with its characteristics was built using OpenStreetMap data and other official road inventory data by intersecting their layers and cleaning up unnecessary links such as residential streets. The accuracy of the final network was then checked by comparing it with satellite images from Google and Bing. For validation, the final network was exported into Emme3 to check for potential network coding issues. Results show a high accuracy between the built network and the satellite images, which can mostly be attributed to the use of open source data.
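As an illustration of the open-data approach, the snippet below pulls a drivable road network for Kinshasa from OpenStreetMap with the osmnx package and saves it for further GIS clean-up. This is not the authors' exact workflow (which combined scanned maps, official inventories and Emme3); the place query and output file are assumptions.

```python
import osmnx as ox

# Download the drivable street network within the administrative boundary.
G = ox.graph_from_place("Kinshasa, Democratic Republic of the Congo", network_type="drive")

# Basic network statistics for a quick sanity check against other sources.
print("nodes:", len(G.nodes), "edges:", len(G.edges))

# Inspect edges and flag residential streets, which the study removed as
# unnecessary links before building the transport analysis network.
nodes, edges = ox.graph_to_gdfs(G)
major = edges[edges["highway"].astype(str) != "residential"]
print("edges kept after dropping residential streets:", len(major))

# Export the full graph for clean-up and intersection with other GIS layers.
ox.save_graph_geopackage(G, filepath="kinshasa_drive.gpkg")
```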

Keywords: geographic information system (GIS), network construction, transportation database, open source data

Procedia PDF Downloads 165
33612 Holistic Risk Assessment Based on Continuous Data from the User’s Behavior and Environment

Authors: Cinzia Carrodano, Dimitri Konstantas

Abstract:

Risk is part of our lives. In today’s society, risk is connected to our safety, and safety has become a major priority in our lives. Each person lives his or her life based on an evaluation of the risk he or she is ready to accept and sustain, and the level of safety he or she wishes to reach, based on highly personal criteria. The assessment of the risk a person takes in a complex environment, and the impact of other people’s actions and of events on our perception of risk, are elements to be considered. The concept of Holistic Risk Assessment (HRA) aims at developing a methodology and a model that will allow us to take into account elements outside the direct influence of the individual and provide a personalized risk assessment. The concept is based on the fact that, in the near future, we will be able to gather and process extremely large amounts of data about an individual and his or her environment in real time. The interaction and correlation of these data is the key element of the holistic risk assessment. In this paper, we present the HRA concept and describe its most important elements and considerations.

Keywords: continuous data, dynamic risk, holistic risk assessment, risk concept

Procedia PDF Downloads 122
33611 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation

Authors: Jonathan Gong

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database, which includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4,035 images and validated on 807 images separate from those used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture and is used to extract the lung mask from the chest X-ray image. It is trained on 8,577 images and validated on a validation split of 20%. The models are evaluated on the external dataset for validation, and their accuracy, precision, recall, F1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to improve the experience of medical professionals and provide insight into the future of the methods used.
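A condensed Keras sketch of the transfer-learning classifier is given below: a frozen DenseNet201 backbone feeds a small dense head that predicts COVID-19, pneumonia or normal. The autoencoder stage and the improved U-Net segmentation model are omitted, and the image size, head layers and training call are illustrative choices rather than the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3          # COVID-19, pneumonia, normal
IMG_SHAPE = (224, 224, 3)

backbone = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet", input_shape=IMG_SHAPE)
backbone.trainable = False                        # transfer learning: freeze the backbone

model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

# Training would use the cropped chest X-rays, e.g. via an image dataset loader:
# train_ds = tf.keras.utils.image_dataset_from_directory("radiography/train",
#                                                        image_size=IMG_SHAPE[:2])
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```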

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 126
33610 Energy and Exergy Analysis of Anode-Supported and Electrolyte–Supported Solid Oxide Fuel Cells Gas Turbine Power System

Authors: Abdulrazzak Akroot, Lutfu Namli

Abstract:

Solid oxide fuel cells (SOFCs) are one of the most promising technologies since they can produce electricity directly from fuel and generate a large amount of waste heat that is generally used in gas turbines to improve the overall performance of the thermal power plant. In this study, the energy and exergy analysis of a solid oxide fuel cell/gas turbine hybrid system was carried out in MATLAB to examine the performance characteristics of the hybrid system in two different configurations: an anode-supported model and an electrolyte-supported model. The obtained results indicate that if the fuel utilization factor is reduced from 0.85 to 0.65, the overall energy efficiency decreases from 64.61% to 59.27% for the anode-supported model, whereas it decreases from 58.3% to 56.4% for the electrolyte-supported model. In addition, the overall exergy efficiency decreases from 53.86% to 44.06% for the anode-supported model, whereas it decreases from 39.96% to 33.94% for the electrolyte-supported model. Furthermore, increasing the air utilization factor has a negative impact on the electrical power output and the efficiencies of the overall system due to the reduction in the O₂ concentration at the cathode-electrolyte interface.

Keywords: solid oxide fuel cell, anode-supported model, electrolyte-supported model, energy analysis, exergy analysis

Procedia PDF Downloads 150
33609 An Educational Program Based on Health Belief Model to Prevent of Non-alcoholic Fatty Liver Disease Among Iranian Women

Authors: Arezoo Fallahi

Abstract:

Background and purpose: Non-alcoholic fatty liver disease is one of the most common liver disorders and, as the most important cause of death from liver disease, has unpleasant consequences and complications. The aim of this study was to investigate the effect of an educational intervention based on the health belief model for the prevention of non-alcoholic fatty liver disease among women. Materials and Methods: This experimental study was performed among 110 women referred to comprehensive health service centers in Malayer City, west of Iran, in 2023. Using convenience sampling, the 110 participants were divided into experimental and control groups. The data collection tools included a demographic characteristics form and a questionnaire based on the health belief model. In the experimental group, three one-hour training sessions were conducted in the form of pamphlets, lectures and group discussions. Data were analyzed using SPSS software version 21, with correlation tests, paired t-tests and independent t-tests. Results: The mean age of the participants was 38.07±6.28 years, and most of them were middle-aged, married housewives with academic education, middle income and overweight. After the educational intervention, the mean scores of the constructs of perceived sensitivity (p=0.01), perceived severity (p=0.01), perceived benefits (p=0.01), internal (p=0.01) and external (p=0.01) cues to action, and perceived self-efficacy (p=0.01) were significantly higher in the experimental group than in the control group. The perceived barriers score in the experimental group decreased after the training (15.2 ± 3.9 vs. 11.2 ± 3.3, p<0.01). Conclusion: The findings of the study showed that the design and implementation of educational programs based on the constructs of the health belief model can be effective in preventing women from developing higher levels of non-alcoholic fatty liver disease.

Keywords: health, education, belief, behaviour

Procedia PDF Downloads 44
33608 Numerical Modeling of Storm Swells in Harbor by Boussinesq Equations Model

Authors: Mustapha Kamel Mihoubi, Hocine Dahmani

Abstract:

The purpose of this work is to study the phenomenon of storm-wave agitation in a harbor basin caused by different wave directions relative to the current layout of the harbor structures, using a numerical model based on the shallow-water Boussinesq equations (MIKE 21 BW). Based on the diminishing effect of wave penetration, an optimal solution will be identified and reproduced in a reduced-scale model. An alternative arrangement of the structures will also be proposed to reduce the agitation and the effects of swell reflection caused by the penetration of waves into the harbor.

Keywords: agitation, Boussinesq equations, combination, harbor

Procedia PDF Downloads 384