Search results for: geometric and topological data models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28308

28128 Topological Analysis of Hydrogen Bonds in Pyruvic Acid-Water Mixtures

Authors: Ferid Hammami

Abstract:

The molecular geometries of the possible conformations of pyruvic acid-water complexes (PA-(H₂O)ₙ, n = 1-4) have been fully optimized at the DFT/B3LYP/6-311++G(d,p) level of calculation. Among several optimized molecular clusters, the most stable molecular arrangements obtained when one, two, three, and four water molecules are hydrogen-bonded to a central pyruvic acid molecule are presented in this paper. Appropriate topological and geometrical parameters are considered as primary indicators of H-bond strength. Atoms-in-molecules (AIM) analysis shows that pyruvic acid can form a ring structure with water, and the molecular structures are stabilized by both strong O-H...O and C-H...O hydrogen bonds. In large clusters, classical O-H...O hydrogen bonds still exist between water molecules, and a cage-like structure is built around some parts of the central pyruvic acid molecule. The molecular electrostatic potential (MEP) map and the HOMO-LUMO (highest occupied molecular orbital-lowest unoccupied molecular orbital) analysis have been performed for all considered complexes.

Keywords: pyruvic acid, PA-water complex, hydrogen bonding, DFT, AIM, MEP, HOMO-LUMO

Procedia PDF Downloads 181
28127 Analysis of Atomic Models in High School Physics Textbooks

Authors: Meng-Fei Cheng, Wei Fneg

Abstract:

The new Taiwan high school standards emphasize employing scientific models and modeling practices in physics learning. However, to our knowledge, few studies address how scientific models and modeling are approached in current science teaching or examine the views of scientific models portrayed in textbooks. To explore the views of scientific models and modeling in textbooks, this study investigated the atomic unit in different textbook versions as an example and provided suggestions for a modeling curriculum. This study adopted a quantitative analysis of qualitative data in the atomic units of four mainstream versions of Taiwan high school physics textbooks. The models were further analyzed using five dimensions of the views of scientific models (nature of models, multiple models, purpose of the models, testing models, and changing models); each dimension had three levels (low, medium, high). Descriptive statistics were employed to compare the frequency with which the five dimensions of the views of scientific models were described in the atomic unit, to understand which views were emphasized, and to compare the frequency of use of the eight scientific models, to identify the atomic model used most often in the textbooks. Descriptive statistics were further utilized to investigate the average levels of the five dimensions of the views of scientific models, to examine whether the textbooks' views were close to the scientific view. The average levels of the five dimensions of the eight atomic models were also compared to examine whether the views of the eight atomic models were close to the scientific views. The results revealed the following three major findings from the atomic unit. (1) Among the five dimensions of the views of scientific models, the most portrayed dimension was the 'purpose of models,' and the least portrayed dimension was 'multiple models.' The most diverse view was the 'purpose of models,' and the most sophisticated scientific view was the 'nature of models.' The least sophisticated scientific view was 'multiple models.' (2) Among the eight atomic models, the most mentioned model was the atomic nucleus model, and the least mentioned was the three-states-of-matter model. (3) Among the correlations between the five dimensions, the dimension of 'testing models' was highly related to the dimension of 'changing models.' In short, this study examined the views of scientific models based on the atomic units of physics textbooks to identify the emphasized and disregarded views in the textbooks. The findings suggest how future textbooks and curricula can provide a thorough view of scientific models to enhance students' model-based learning.

Keywords: atomic models, textbooks, science education, scientific model

Procedia PDF Downloads 124
28126 A Review on Water Models of Surface Water Environment

Authors: Shahbaz G. Hassan

Abstract:

Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of surface water quality models and to provide directions for selecting models in specific situations. Water quality models include one kind of model based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and are capable of long-term simulation, but they are highly complex. Therefore, more space is provided to explain the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries, which limits the application range of each model; this paper introduces the principles and applications of water quality models for these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited to particular geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to the difference in mathematical algorithm: models based on artificial intelligence and models based on statistical methods.

Keywords: empirical models, mathematical, statistical, water quality

Procedia PDF Downloads 231
28125 Teachers’ Instructional Decisions When Teaching Geometric Transformations

Authors: Lisa Kasmer

Abstract:

Teachers' instructional decisions shape the structure and content of mathematics lessons and influence the mathematics that students are given the opportunity to learn. Therefore, it is important to better understand how teachers make instructional decisions and thus find new ways to help practicing and future teachers give their students a more effective and robust learning experience. Understanding the relationship between teachers' instructional decisions and their goals, resources, and orientations (beliefs) is important given the heightened focus on geometric transformations in the middle school mathematics curriculum. This work is significant because the development and support of current and future teachers require more effective ways to teach geometry to their students. The following research questions frame this study: (1) As middle school mathematics teachers plan and enact instruction related to teaching transformations, what thinking processes do they engage in to make decisions about teaching transformations with or without a coordinate system? and (2) How do the goals, resources, and orientations of these teachers impact their instructional decisions, and what do these decisions reveal about their understanding of teaching transformations? Teachers and students alike struggle with understanding transformations; many teachers skip or hurriedly teach transformations at the end of the school year. However, transformations are an important mathematical topic, as this topic supports students' understanding of geometric and spatial reasoning. Geometric transformations are a foundational concept in mathematics, not only for understanding congruence and similarity but also for proofs, algebraic functions, and calculus. Geometric transformations also underpin the secondary mathematics curriculum, as features of transformations transfer to other areas of mathematics. Teachers' instructional decisions, in terms of the goals, orientations, and resources that support these decisions, were analyzed using open coding. Open coding is recognized as an initial step in qualitative analysis, where comparisons are made and preliminary categories are considered. Initial codes and categories from current research on teachers' thinking processes related to the decisions they make while planning and reflecting on lessons were also noted. Ideas and additional themes common across teachers were surfaced, compared, and analyzed while seeking patterns. Finally, attributes of teachers' goals, orientations, and resources were identified in order to begin to build a picture of the reasoning behind their instructional decisions. These categories became the basis for the organization and conceptualization of the data. Preliminary results suggest that teachers often rely on their own orientations about teaching geometric transformations. These beliefs are underpinned by the teachers' own mathematical knowledge related to teaching transformations. When teachers do not have a robust understanding of transformations, they are limited by this lack of knowledge. These shortcomings impact students' opportunities to learn and thus disadvantage students' understanding of transformations. Teachers' goals are also limited by their paucity of knowledge regarding transformations, as these goals do not fully represent the range of comprehension a teacher needs to teach this topic well.

Keywords: coordinate plane, geometric transformations, instructional decisions, middle school mathematics

Procedia PDF Downloads 59
28124 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue

Authors: Rachel Y. Zhang, Christopher K. Anderson

Abstract:

A critical aspect of revenue management is a firm's ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexities of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including K-nearest-neighbors, support vector machine, regression tree, and artificial neural network algorithms. The out-of-sample performances of the above approaches to forecasting hotel demand are illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A's data) with models using market-level data (prices, review scores, location, chain scale, etc., for all hotels within the market). The proposed models will be valuable for predicting hotel revenue given the basic characteristics of a hotel property and can be applied in performance evaluation for an existing hotel. The findings will unveil the features that play key roles in a hotel's revenue performance, which would have considerable potential usefulness in both revenue prediction and evaluation.
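The out-of-sample comparison described above can be sketched as follows. This is a minimal illustration on synthetic demand data (the proprietary Las Vegas transactional data are not reproduced), using the four algorithm families named in the abstract; the predictor set and coefficients are assumptions for demonstration only.

```python
# Minimal sketch: compare the four model families on synthetic hotel-demand data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 1000
# Hypothetical market-level predictors: own rate, review score, day of week, competitor rate.
X = np.column_stack([
    rng.uniform(80, 300, n),
    rng.uniform(3.0, 5.0, n),
    rng.integers(0, 7, n),
    rng.uniform(80, 300, n),
])
# Synthetic demand: falls with own price, rises with reviews and competitor price.
y = 200 - 0.5 * X[:, 0] + 20 * X[:, 1] + 5 * (X[:, 2] >= 5) + 0.3 * X[:, 3] \
    + rng.normal(0, 10, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "k-nearest neighbors": KNeighborsRegressor(n_neighbors=10),
    "support vector machine": SVR(C=10.0),
    "regression tree": DecisionTreeRegressor(max_depth=6, random_state=0),
    "neural network": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}
for name, estimator in models.items():
    pipe = make_pipeline(StandardScaler(), estimator)   # scale features for SVR/MLP
    pipe.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, pipe.predict(X_test))
    print(f"{name:24s} out-of-sample MAE: {mae:.1f}")
```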

Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine

Procedia PDF Downloads 105
28123 A Numerical Hybrid Finite Element Model for Lattice Structures Using 3D/Beam Elements

Authors: Ahmadali Tahmasebimoradi, Chetra Mang, Xavier Lorang

Abstract:

Thanks to the additive manufacturing process, lattice structures are replacing traditional structures in the aeronautical and automobile industries. In order to evaluate the mechanical response of lattice structures, one has to resort to numerical techniques. Ansys is a globally well-known and trusted commercial software package that allows us to model lattice structures and analyze their mechanical responses using either solid or beam elements. In this software, a script may be used to systematically generate lattice structures of any size. On the one hand, solid elements allow us to correctly model the contact between the substrates (the supports of the lattice structure) and the lattice structure, the local plasticity, and the junctions of the microbeams. However, their computational cost increases rapidly with the size of the lattice structure. On the other hand, although beam elements reduce the computational cost drastically, they do not correctly model the contact between the lattice structure and the substrates or the junctions of the microbeams. Also, the notion of local plasticity is no longer valid. Moreover, the deformed shape of the lattice structure does not correspond to the deformed shape obtained using 3D solid elements. In this work, motivated by the pros and cons of the 3D and beam models, a numerically hybrid model is presented for lattice structures to reduce the computational cost of the simulations while avoiding the aforementioned drawbacks of the beam elements. This approach consists of using solid elements for the junctions and beam elements for the microbeams connecting the corresponding junctions to each other. When the global response of the structure is linear, the results from the hybrid models are in good agreement with the ones from the 3D models for body-centered cubic with z-struts (BCCZ) and body-centered cubic without z-struts (BCC) lattice structures. However, the hybrid models have difficulty converging when the effects of large deformation and local plasticity are considerable in the BCCZ structures. Furthermore, the effect of the junction size in the hybrid models on the results is investigated. For BCCZ lattice structures, the results are not affected by the junction size. This is also valid for BCC lattice structures as long as the ratio of the junction size to the diameter of the microbeams is greater than 2. The hybrid model can take into account geometric defects. As a demonstration, the point clouds of two lattice structures are parametrized in a platform called LATANA (LATtice ANAlysis) developed by IRT-SystemX. In this process, for each microbeam of the lattice structures, an ellipse is fitted to capture the effect of shape variation and roughness. Each ellipse is represented by three parameters: semi-major axis, semi-minor axis, and angle of rotation. Having the parameters of the ellipses, the lattice structures are constructed in SpaceClaim (ANSYS) using the geometrical hybrid approach. The results show a negligible discrepancy between the hybrid and 3D models, while the computational cost of the hybrid model is lower than that of the 3D model.
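The three-parameter ellipse description of a microbeam cross-section can be illustrated with a rough sketch. The LATANA fitting procedure itself is not public here, so the PCA-based estimate below is only one simple way to obtain a semi-major axis, semi-minor axis, and rotation angle from a cross-section point cloud; all names and values are illustrative.

```python
# Rough sketch (not the LATANA procedure): PCA-based ellipse parameters of a
# microbeam cross-section point cloud.
import numpy as np

def ellipse_params(points):
    """points: (n, 2) cross-section coordinates.
    Returns simple estimates of (semi_major, semi_minor, angle_rad)."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes_dir = eigvecs[:, np.argsort(eigvals)[::-1]]   # major axis first
    proj = centered @ axes_dir
    # Half of the point spread along each principal direction as the semi-axis.
    semi_major, semi_minor = (proj.max(axis=0) - proj.min(axis=0)) / 2.0
    angle = np.arctan2(axes_dir[1, 0], axes_dir[0, 0])
    return semi_major, semi_minor, angle

# Synthetic noisy cross-section for illustration.
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 400)
a, b, theta = 1.0, 0.6, np.deg2rad(30)
pts = np.column_stack([a * np.cos(t) + rng.normal(0, 0.02, t.size),
                       b * np.sin(t) + rng.normal(0, 0.02, t.size)])
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
print(ellipse_params(pts @ R.T))   # should roughly recover (1.0, 0.6, 30 degrees)
```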

Keywords: additive manufacturing, Ansys, geometric defects, hybrid finite element model, lattice structure

Procedia PDF Downloads 88
28122 Reliability Estimation of Bridge Structures with Updated Finite Element Models

Authors: Ekin Ozer

Abstract:

Assessment of structural reliability is essential for the efficient use of civil infrastructure that is subjected to hazardous events. Dynamic analysis of finite element models is a commonly used tool to simulate structural behavior and estimate its performance accordingly. However, theoretical models based purely on preliminary assumptions and design drawings may deviate from the actual behavior of the structure. This study proposes an up-to-date reliability estimation procedure that uses actual bridge vibration data to modify finite element models through finite element model updating and then performs reliability estimation accordingly. The proposed method utilizes vibration response measurements of bridge structures to identify modal parameters, then uses these parameters to calibrate finite element models originally based on design drawings. The proposed method not only shows that reliability estimation based on updated models differs from that based on the original models, but also suggests that non-updated models may overestimate the structural capacity.
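The calibration step can be illustrated with a minimal one-parameter example: adjust a stiffness parameter until the model's natural frequency matches the frequency identified from measurements. The single-degree-of-freedom stand-in, the mass, and the measured frequency below are assumptions; the study's bridge models and updating algorithm are not reproduced.

```python
# Minimal illustration of model updating: calibrate stiffness so the model's
# first natural frequency matches the identified frequency (SDOF stand-in).
import numpy as np
from scipy.optimize import minimize_scalar

m = 2.0e5                 # modal mass [kg], assumed
f_measured = 1.8          # identified first natural frequency [Hz], assumed

def model_frequency(k):
    return np.sqrt(k / m) / (2.0 * np.pi)

def mismatch(k):
    return (model_frequency(k) - f_measured) ** 2

k0 = 1.0e7                # design-drawing stiffness [N/m], assumed
res = minimize_scalar(mismatch, bounds=(0.1 * k0, 10 * k0), method="bounded")
k_updated = res.x
print(f"design stiffness:  {k0:.3e} N/m -> f = {model_frequency(k0):.2f} Hz")
print(f"updated stiffness: {k_updated:.3e} N/m -> f = {model_frequency(k_updated):.2f} Hz")
# The updated stiffness then feeds the reliability estimation, which may differ
# from the estimate based on the original (non-updated) model.
```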

Keywords: earthquake engineering, engineering vibrations, reliability estimation, structural health monitoring

Procedia PDF Downloads 178
28121 Data Challenges Facing Implementation of Road Safety Management Systems in Egypt

Authors: A. Anis, W. Bekheet, A. El Hakim

Abstract:

Implementing a Road Safety Management System (SMS) in a crowded developing country such as Egypt is a necessity. Establishing a sustainable SMS requires a comprehensive, reliable data system for all information pertinent to road crashes. In this paper, a survey of the data available in Egypt is presented, and its validity for use in an SMS is assessed. The research provides some missing data and identifies the data that are unavailable in Egypt, looking forward to the contribution of the scientific society, the authorities, and the public in solving the problem of missing or unreliable crash data. The data required for implementing an SMS in Egypt are divided into three categories. The first is available data, such as fatality and injury rates, which this research shows may be inconsistent and unreliable. The second category is data that are not available but may be estimated; an example of estimating vehicle cost is given in this research. The third is data that are not available but can be measured case by case, such as the functional and geometric properties of a facility. Some inquiries are provided in this research for the scientific society, such as how to improve the links among road safety stakeholders in order to obtain a consistent, unbiased, and reliable data system.

Keywords: road safety management system, road crash, road fatality, road injury

Procedia PDF Downloads 87
28120 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics

Authors: Janne Engblom, Elias Oikarinen

Abstract:

A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond generalized method of moments (GMM) estimator, which is an extension of the Arellano-Bond estimator in which past values, and different transformations of past values, of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano-Bover/Blundell-Bond estimator augments Arellano-Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations (the original equation and the transformed one) and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically using the Arellano-Bover/Blundell-Bond estimation technique together with ordinary least squares (OLS). The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. The Arellano-Bover/Blundell-Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with (1) a linear functional relationship; (2) one left-hand-side variable that is dynamic, depending on its own past realizations; (3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; (4) fixed individual effects; and (5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data for 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics differed considerably depending on the models and instruments used. In particular, the use of different instrumental variables caused variation in the model estimates and their statistical significance. This was particularly clear when comparing the OLS estimates with those of the different dynamic panel data models. Estimates provided by the dynamic panel data models were more in line with the theory of housing price dynamics.
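The instrumenting idea behind these estimators can be made concrete with a stripped-down simulation: first-difference the dynamic equation and instrument the lagged difference with the second lag of the level (the Anderson-Hsiao special case that Arellano-Bond and Blundell-Bond generalize). The panel dimensions and the persistence parameter below are assumptions, not the Finnish data, and the full system GMM estimator is not implemented here.

```python
# Stripped-down simulation of the instrumenting idea behind difference/system GMM.
import numpy as np

rng = np.random.default_rng(0)
N, T, rho = 14, 25, 0.6              # e.g. 14 "cities", 25 periods, true persistence

alpha = rng.normal(0, 1, N)          # unobserved fixed effects
eps = rng.normal(0, 1, (N, T))
y = np.zeros((N, T))
y[:, 0] = alpha + eps[:, 0]
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + alpha + eps[:, t]

# Pooled OLS on the lagged level ignores the fixed effect and is biased.
y_lag, y_cur = y[:, :-1].ravel(), y[:, 1:].ravel()
rho_ols = (y_lag @ y_cur) / (y_lag @ y_lag)

# First-difference the model and instrument the lagged difference with the
# second lag of the level (Anderson-Hsiao), the building block of Arellano-Bond.
dy = np.diff(y, axis=1)
dep = dy[:, 1:].ravel()              # delta y_t,   t = 2..T-1
reg = dy[:, :-1].ravel()             # delta y_{t-1}
z = y[:, :-2].ravel()                # instrument y_{t-2}
rho_iv = (z @ dep) / (z @ reg)

print(f"true rho = {rho:.2f}, pooled OLS = {rho_ols:.2f}, IV on differences = {rho_iv:.2f}")
```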

Keywords: dynamic model, fixed effects, panel data, price dynamics

Procedia PDF Downloads 1435
28119 Non–Geometric Sensitivities Using the Adjoint Method

Authors: Marcelo Hayashi, João Lima, Bruno Chieregatti, Ernani Volpe

Abstract:

The adjoint method has been used as a successful tool to obtain sensitivity gradients in aerodynamic design and optimisation for many years. This work presents an alternative approach to the continuous adjoint formulation that enables one to compute gradients of a given measure of merit with respect to control parameters other than those pertaining to geometry. The procedure is then applied to the steady 2–D compressible Euler and incompressible Navier–Stokes flow equations. Finally, the results are compared with sensitivities obtained by finite differences and theoretical values for validation.

Keywords: adjoint method, aerodynamics, sensitivity theory, non-geometric sensitivities

Procedia PDF Downloads 517
28118 A Review on 3D Smart City Platforms Using Remotely Sensed Data to Aid Simulation and Urban Analysis

Authors: Slim Namouchi, Bruno Vallet, Imed Riadh Farah

Abstract:

3D urban models provide powerful tools for decision making, urban planning, and smart city services. The accuracy of these 3D-based systems is directly related to the quality of the models. Since manual large-scale modeling of cities or countries is a highly time-intensive and very expensive process, fully automatic 3D building generation is needed. However, the result of the 3D modeling process depends on the input data, the properties of the captured objects, and the required characteristics of the reconstructed 3D model. Nowadays, producing a 3D real-world model is no longer a problem. Remotely sensed data have experienced a remarkable increase in recent years, especially data acquired using unmanned aerial vehicles (UAVs). As scanning techniques develop, the amount of captured data grows and its resolution becomes more precise. This paper presents a literature review, which aims to identify different methods of automatic 3D building extraction, either from LiDAR or from the combination of LiDAR and satellite or aerial images. Then, we present open source technologies and data models (e.g., CityGML, PostGIS, CesiumJS) used to integrate these models into geospatial base layers for smart city services.

Keywords: CityGML, LiDAR, remote sensing, GIS, smart city, 3D urban modeling

Procedia PDF Downloads 102
28117 Nano Generalized Topology

Authors: M. Y. Bakeir

Abstract:

Rough set theory is a recent approach for reasoning about data. It has achieved a large number of applications in various real-life fields. The main idea of rough sets corresponds to the lower and upper set approximations. These two approximations are exactly the interior and the closure of the set with respect to a certain topology on a collection U of imprecise data acquired from any real-life field. The base of the topology is formed by the equivalence classes of an equivalence relation E defined on U using the available information about the data. The theory of generalized topology was studied by Császár. It is well known that generalized topology in the sense of Császár is a generalization of the topology on a set. On the other hand, many important collections of sets related to the topology on a set form a generalized topology. The notion of nano topology was introduced by Lellis Thivagar; it is defined in terms of the approximations and boundary region of a subset of a universe using an equivalence relation on it. The purpose of this paper is to introduce a new generalized topology in terms of rough sets, called nano generalized topology.
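The approximations and the nano topology that the paper starts from can be sketched in a few lines. The nano generalized topology proposed in the paper is not reproduced here; the universe, relation, and target set below are toy values.

```python
# Small sketch of rough-set approximations and the nano topology
# {U, empty set, L(X), Upper(X), B(X)} that the paper generalizes.
def equivalence_classes(universe, relation):
    """relation: dict mapping each element to its class label
    (the available information about the data)."""
    classes = {}
    for x in universe:
        classes.setdefault(relation[x], set()).add(x)
    return list(classes.values())

def approximations(universe, relation, X):
    classes = equivalence_classes(universe, relation)
    lower = set().union(*(c for c in classes if c <= X))   # classes contained in X
    upper = set().union(*(c for c in classes if c & X))    # classes meeting X
    return lower, upper, upper - lower                     # boundary region

U = {"a", "b", "c", "d", "e"}
E = {"a": 1, "b": 1, "c": 2, "d": 3, "e": 3}    # classes {a,b}, {c}, {d,e}
X = {"a", "b", "d"}

L, Up, B = approximations(U, E, X)
nano_topology = [U, set(), L, Up, B]
print("lower:", L, "upper:", Up, "boundary:", B)
# lower: {'a','b'}  upper: {'a','b','d','e'}  boundary: {'d','e'}
```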

Keywords: rough sets, topological space, generalized topology, nano topology

Procedia PDF Downloads 405
28116 A CFD Analysis of Hydraulic Characteristics of the Rod Bundles in the BREST-OD-300 Wire-Spaced Fuel Assemblies

Authors: Dmitry V. Fomichev, Vladimir V. Solonin

Abstract:

This paper presents the findings from a numerical simulation of the flow in 37-rod fuel assembly models spaced by a double-wire trapezoidal wrapping, as applied to the BREST-OD-300 experimental nuclear reactor. Data on the static pressure distribution within the models and equations for determining the fuel bundle flow friction factors have been obtained. Recommendations are provided on using the turbulence closure models available in ANSYS Fluent. A comparative analysis has been performed against the existing empirical equations for determining the flow friction factors, and good agreement between the calculated and experimental data has been shown. An analysis of the experimental data and the results of the numerical simulation of the BREST-OD-300 fuel rod assembly hydrodynamic performance is presented.

Keywords: BREST-OD-300, wire spacers, fuel assembly, computational fluid dynamics

Procedia PDF Downloads 351
28115 AS-Geo: Arbitrary-Sized Image Geolocalization with Learnable Geometric Enhancement Resizer

Authors: Huayuan Lu, Chunfang Yang, Ma Zhu, Baojun Qi, Yaqiong Qiao, Jiangqian Xu

Abstract:

Image geolocalization has great application prospects in fields such as autonomous driving and virtual/augmented reality. In practical application scenarios, the size of the image to be located is not fixed, and it is impractical to train different networks for all possible sizes. When the image size does not match the input size of the descriptor extraction model, existing image geolocalization methods usually directly scale or crop the image in some common way. This results in the loss of information important to the geolocalization task, thus affecting the performance of the image geolocalization method. For example, excessive down-sampling can lead to blurred building contours, and inappropriate cropping can lead to the loss of key semantic elements, resulting in incorrect geolocation results. To address this problem, this paper designs a learnable image resizer and proposes an arbitrary-sized image geolocalization method. (1) The designed learnable image resizer employs the self-attention mechanism to enhance the geometric features of the resized image. First, it applies bilinear interpolation to the input image and its feature maps to obtain the initial resized image and the resized feature maps. Then, SKNet (selective kernel network) is used to approximate the best receptive field, thus keeping the geometric shapes as in the original image, and SENet (squeeze-and-excitation network) is used to automatically select the feature maps with strong contour information, enhancing the geometric features. Finally, the enhanced geometric features are fused with the initial resized image to obtain the final resized image. (2) The proposed image geolocalization method embeds the above image resizer as a front layer of the descriptor extraction network. It not only enables the network to be compatible with arbitrary-sized input images but also enhances the geometric features that are crucial to the image geolocalization task. Moreover, a triplet attention mechanism is added after the first convolutional layer of the backbone network to optimize the utilization of the geometric elements extracted by the first convolutional layer. Finally, the local features extracted by the backbone network are aggregated to form image descriptors for image geolocalization. The proposed method was evaluated on several mainstream datasets, such as Pittsburgh30K, Tokyo24/7, and Places365. The results show that the proposed method has excellent size compatibility and compares favorably with recent mainstream geolocalization methods.
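Two of the building blocks named above, bilinear resizing to the descriptor network's input size and squeeze-and-excitation channel gating of the resized feature maps, can be sketched in PyTorch as follows. The full AS-Geo resizer (including the SKNet branch and its fusion) is not reproduced; the module names, channel counts, and target size are illustrative assumptions.

```python
# Minimal PyTorch sketch: bilinear resize + SE-style channel gating, fused back
# into the resized image (an illustration, not the AS-Geo architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SEGate(nn.Module):
    """Squeeze-and-excitation: reweight channels using global context."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (B, C, H, W)
        w = x.mean(dim=(2, 3))                 # squeeze: global average pooling
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)
        return x * w                           # excitation: channel reweighting

class SimpleResizer(nn.Module):
    def __init__(self, target_size=(224, 224), channels=16):
        super().__init__()
        self.target_size = target_size
        self.stem = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.gate = SEGate(channels)
        self.head = nn.Conv2d(channels, 3, kernel_size=3, padding=1)

    def forward(self, img):                    # img: (B, 3, H, W), any H and W
        resized = F.interpolate(img, size=self.target_size,
                                mode="bilinear", align_corners=False)
        feats = self.gate(self.stem(resized))  # emphasize contour-carrying channels
        return resized + self.head(feats)      # fuse with the plainly resized image

x = torch.randn(2, 3, 480, 640)                # arbitrary-sized input
print(SimpleResizer()(x).shape)                # torch.Size([2, 3, 224, 224])
```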

Keywords: image geolocalization, self-attention mechanism, image resizer, geometric feature

Procedia PDF Downloads 178
28114 Self-Supervised Pretraining on Sequences of Functional Magnetic Resonance Imaging Data for Transfer Learning to Brain Decoding Tasks

Authors: Sean Paulsen, Michael Casey

Abstract:

In this work we present a self-supervised pretraining framework for transformers on functional Magnetic Resonance Imaging (fMRI) data. First, we pretrain our architecture on two self-supervised tasks simultaneously to teach the model a general understanding of the temporal and spatial dynamics of human auditory cortex during music listening. Our pretraining results are the first to suggest a synergistic effect of multitask training on fMRI data. Second, we finetune the pretrained models and train additional fresh models on a supervised fMRI classification task. We observe significantly improved accuracy on held-out runs with the finetuned models, which demonstrates the ability of our pretraining tasks to facilitate transfer learning. This work contributes to the growing body of literature on transformer architectures for pretraining and transfer learning with fMRI data, and serves as a proof of concept for our pretraining tasks and multitask pretraining on fMRI data.
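One common form of self-supervised pretraining on sequence data can be sketched as follows: mask random timepoints of an fMRI-like region-of-interest time series and train a transformer encoder to reconstruct them. This is a generic illustration only; the paper's actual pretraining tasks, architecture, and auditory-cortex data are not reproduced, and all sizes are assumptions (positional encodings are also omitted for brevity).

```python
# Generic sketch: masked-timepoint reconstruction with a transformer encoder.
import torch
import torch.nn as nn

n_rois, seq_len, d_model = 64, 100, 128
encoder = nn.Sequential(
    nn.Linear(n_rois, d_model),
    nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True),
        num_layers=4,
    ),
    nn.Linear(d_model, n_rois),
)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)

def pretrain_step(batch, mask_frac=0.15):
    """batch: (B, seq_len, n_rois), a simulated stand-in for fMRI sequences."""
    mask = torch.rand(batch.shape[:2]) < mask_frac          # (B, seq_len)
    corrupted = batch.clone()
    corrupted[mask] = 0.0                                    # hide masked timepoints
    recon = encoder(corrupted)
    loss = ((recon - batch)[mask] ** 2).mean()               # loss only on masked steps
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

fake_batch = torch.randn(8, seq_len, n_rois)
print(pretrain_step(fake_batch))
# After pretraining, the encoder would be finetuned on the supervised decoding task.
```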

Keywords: transfer learning, fMRI, self-supervised, brain decoding, transformer, multitask training

Procedia PDF Downloads 60
28113 Chemometric Estimation of Inhibitory Activity of Benzimidazole Derivatives by Linear Least Squares and Artificial Neural Networks Modelling

Authors: Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević, Lidija R. Jevrić, Stela Jokić

Abstract:

The subject of this paper is to correlate the antibacterial behavior of benzimidazole derivatives with their molecular characteristics using a chemometric QSAR (quantitative structure-activity relationships) approach. QSAR analysis has been carried out on the inhibitory activity of benzimidazole derivatives against Staphylococcus aureus. The data were processed by linear least squares (LLS) and artificial neural network (ANN) procedures. The LLS mathematical models have been developed as calibration models for the prediction of inhibitory activity. The quality of the models was validated by the leave-one-out (LOO) technique and by using an external data set. High agreement between experimental and predicted inhibitory activities indicated the good quality of the derived models. These results are part of the CMST COST Action No. CM1306 "Understanding Movement and Mechanism in Molecular Machines".
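The LLS calibration and leave-one-out validation workflow described above can be sketched as follows. The benzimidazole descriptors and measured activities are not reproduced, so the descriptor matrix and response below are placeholder data.

```python
# Sketch of LLS calibration + leave-one-out (LOO) validation on placeholder data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_compounds, n_descriptors = 25, 3
X = rng.normal(size=(n_compounds, n_descriptors))        # stand-in molecular descriptors
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 0.2, n_compounds)   # activity-like response

model = LinearRegression()
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut()) # prediction with each compound left out

press = np.sum((y - y_loo) ** 2)                         # predictive residual sum of squares
q2 = 1 - press / np.sum((y - y.mean()) ** 2)             # cross-validated Q^2
model.fit(X, y)
r2 = model.score(X, y)                                   # calibration R^2 on the full set
print(f"calibration R^2 = {r2:.3f}, LOO Q^2 = {q2:.3f}")
```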

Keywords: antibacterial, benzimidazoles, chemometrics, QSAR

Procedia PDF Downloads 288
28112 Competitor Integration with Voice of Customer Ratings in QFD Studies Using Geometric Mean Based on AHP

Authors: Zafar Iqbal, Nigel P. Grigg, K. Govindaraju, Nicola M. Campbell-Allen

Abstract:

Quality function deployment (QFD) is a structured approach that has been used to improve the quality of products and processes in a wide range of fields. Using this systematic tool, practitioners normally rank Voice of Customer (VoC) ratings in order to produce improvement ratios (IRs), which become the basis for prioritising process/product design or improvement activities. In one matrix of the House of Quality (HOQ), competitors are rated. The method of obtaining improvement ratios (IRs) does not always integrate the competitors' ratings in a systematic way that fully utilises competitor rating information. This can have the effect of diverting QFD practitioners' attention from a potentially important VoC to a less important one. In order to enhance QFD analysis, we present a more systematic method for integrating competitor ratings, utilising the geometric mean of the customer rating matrix. In this paper we develop a new approach, based on the analytic hierarchy process (AHP), in which we generate a matrix of multiple comparisons of all competitors and derive a geometric mean for each competitor. For each VoC, an improved IR is derived which, we argue, enhances the initial VoC importance ratings by integrating more information about competitor performance. In this way, our method can help overcome one of the possible shortcomings of QFD. We then use a published QFD example from the literature as a case study to demonstrate the use of the new AHP-based IRs and show how these can be used to re-rank existing VoCs to, arguably, better achieve the goal of customer satisfaction in relation to VoC ratings and competitors' rankings. We demonstrate how the two-dimensional AHP-based geometric mean derived from the multiple competitor comparisons matrix can be useful for analysing competitors' rankings. Our method utilises an established methodology (AHP) applied within an established application (QFD), but in an original way (through the competitor analysis matrix), to achieve a novel improvement.
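The geometric-mean step of AHP applied to a competitor comparison matrix can be sketched in a few lines. The full HOQ integration and the published case-study numbers are not reproduced; the comparison values below are illustrative.

```python
# Sketch of AHP geometric-mean priorities from a pairwise comparison of competitors.
import numpy as np

# A[i, j] = how much better competitor i performs than competitor j (Saaty-style scale).
A = np.array([
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

geo_means = np.prod(A, axis=1) ** (1.0 / A.shape[1])   # row geometric means
weights = geo_means / geo_means.sum()                  # normalized competitor weights
print(weights)   # roughly [0.65, 0.23, 0.12]: relative standing of the three competitors

# An adjusted improvement ratio for a VoC item could then blend the usual
# target/current ratio with these competitor weights (the paper's exact formula
# is not reproduced here).
```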

Keywords: quality function deployment, geometric mean, improvement ratio, AHP, competitors ratings

Procedia PDF Downloads 335
28111 Monomial Form Approach to Rectangular Surface Modeling

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Geometric modeling plays an important role in the construction and manufacturing of curve, surface, and solid models. Their algorithms are critically important not only in the automobile, ship, and aircraft manufacturing businesses but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics, and visualization. The calculation and display of geometric objects can be accomplished by these six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach, and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) will be used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong, and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation, and degree reduction.
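The coefficient-matrix idea can be illustrated with the familiar cubic Bezier (Bernstein) basis rather than the Said-Ball, Wang-Ball, DP, Dejdumrong, or NB1 bases treated in the paper: the patch is evaluated as S(u, v) = U C Vᵀ with C = M P Mᵀ precomputed once from the control net. The control-net values below are illustrative.

```python
# Sketch of the monomial (coefficient matrix) form of a rectangular patch,
# shown for the cubic Bezier basis.
import numpy as np

# Bernstein-to-monomial conversion matrix for cubic curves.
M = np.array([
    [ 1,  0,  0, 0],
    [-3,  3,  0, 0],
    [ 3, -6,  3, 0],
    [-1,  3, -3, 1],
], dtype=float)

# 4x4 control net, one array per coordinate (here only the height coordinate).
Pz = np.array([
    [0.0, 0.2, 0.2, 0.0],
    [0.2, 0.8, 0.8, 0.2],
    [0.2, 0.8, 0.8, 0.2],
    [0.0, 0.2, 0.2, 0.0],
])
C = M @ Pz @ M.T          # monomial coefficient matrix, computed once

def patch_z(u, v):
    U = np.array([1.0, u, u**2, u**3])
    V = np.array([1.0, v, v**2, v**3])
    return U @ C @ V

print(patch_z(0.5, 0.5))  # height at the patch centre
# Derivatives reuse the same C, e.g. dS/du uses U' = [0, 1, 2u, 3u^2];
# degree elevation and reduction likewise act on the coefficient matrix.
```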

Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications

Procedia PDF Downloads 123
28110 The Grit in the Glamour: A Qualitative Study of the Well-Being of Fashion Models

Authors: Emily Fortune Super, Ameerah Khadaroo, Aurore Bardey

Abstract:

Fashion models are often assumed to have a glamorous job with limited consideration for their well-being. This study aims to assess the well-being of models through semi-structured interviews with six professional fashion models and six industry professionals. Thematic analysis revealed that although models experienced improved self-confidence, they also reported heightened anxiety levels, body image issues, and the negative influence of modelling on their self-esteem. By contrast, industry professionals reported no or minimum concerns about anxious behaviours or the general well-being of fashion models. Being resilient as a model was perceived as an essential attribute to have by both models and industry professionals as they face recurrent rejection in this industry. These results demonstrate a significant gap in the current understanding of the well-being of fashion models between industry professionals and the models themselves. Findings imply that there is an inherent need for change in the modelling industry to promote and enhance their well-being.

Keywords: body image, fashion industry, modelling, well-being

Procedia PDF Downloads 141
28109 Evaluation of Football Forecasting Models: 2021 Brazilian Championship Case Study

Authors: Flavio Cordeiro Fontanella, Asla Medeiros e Sá, Moacyr Alvim Horta Barbosa da Silva

Abstract:

In the present work, we analyse the performance of football results forecasting models. In order to do so, we collected data from eight different forecasting models during the 2021 Brazilian football season. First, we guide the analysis through visual representations of the data, designed to highlight the most prominent features and enhance the interpretation of differences and similarities between the models. We propose using a 2-simplex triangle to investigate visual patterns in the results forecasting models. Next, we compute the expected points for every team playing in the championship and compare them to the final league standings, revealing interesting contrasts between actual and expected performances. Then, we evaluate forecast accuracy using the Ranked Probability Score (RPS); the model comparison accounts for small scale differences that may become consistent over time. Finally, we observe that the Wisdom of Crowds principle can be appropriately applied in this context, leading into a discussion of how results forecasts are used in practice. This paper's primary goal is to encourage discussion of football forecasts' performance. We hope to accomplish it by presenting appropriate criteria and easy-to-understand visual representations that can point out the relevant factors of the subject.
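The Ranked Probability Score used for the accuracy evaluation has a short implementation. The eight collected models and the 2021 Brazilian results are not reproduced; the probabilities below are made up for illustration.

```python
# Ranked Probability Score (RPS) for three-way (home win / draw / away win) forecasts.
import numpy as np

def rps(forecast, outcome):
    """forecast: probabilities over the ordered outcomes (home win, draw, away win).
    outcome: index of the observed outcome (0, 1 or 2)."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.zeros_like(forecast)
    observed[outcome] = 1.0
    cum_diff = np.cumsum(forecast) - np.cumsum(observed)
    return np.sum(cum_diff[:-1] ** 2) / (len(forecast) - 1)

# Two hypothetical forecasts for a match that ended in a home win (outcome 0):
print(rps([0.60, 0.25, 0.15], outcome=0))   # sharper forecast -> lower RPS (about 0.091)
print(rps([0.40, 0.30, 0.30], outcome=0))   # vaguer forecast  -> higher RPS (about 0.225)
# A model's season score is the average RPS over all matches it forecast.
```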

Keywords: accuracy evaluation, Brazilian championship, football results forecasts, forecasting models, visual analysis

Procedia PDF Downloads 63
28108 Superiority of High Frequency Based Volatility Models: Empirical Evidence from an Emerging Market

Authors: Sibel Celik, Hüseyin Ergin

Abstract:

The paper aims to find the best volatility forecasting model for stock markets in Turkey. For this purpose, we compare the performance of different volatility models, both the traditional GARCH model and high-frequency-based volatility models, and conclude that in both the pre-crisis and crisis periods, the performance of high-frequency-based volatility models is better than that of the traditional GARCH model. The findings of the paper are important for policy makers, financial institutions, and investors.
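The high-frequency measure underlying the comparison, daily realized variance as the sum of squared intraday returns, can be sketched on simulated data. The Turkish market data and the specific model set are not reproduced; a GARCH(1,1) benchmark could be fitted to the daily returns with a package such as arch.

```python
# Sketch: daily realized volatility from simulated 5-minute returns.
import numpy as np

rng = np.random.default_rng(0)
n_days, intraday_obs = 250, 78          # e.g. 5-minute returns over a 6.5-hour session
true_daily_vol = 0.01 * np.exp(0.5 * np.sin(np.linspace(0, 6, n_days)))  # slowly varying

# Simulated intraday returns: each day's 5-minute returns share that day's volatility.
intraday = rng.normal(0.0, true_daily_vol[:, None] / np.sqrt(intraday_obs),
                      size=(n_days, intraday_obs))

realized_var = np.sum(intraday ** 2, axis=1)         # daily realized variance
realized_vol = np.sqrt(realized_var)                 # daily realized volatility
daily_returns = intraday.sum(axis=1)                 # close-to-close daily return

print("corr(realized vol, true vol):",
      np.corrcoef(realized_vol, true_daily_vol)[0, 1].round(2))
print("corr(|daily return|, true vol):",
      np.corrcoef(np.abs(daily_returns), true_daily_vol)[0, 1].round(2))
# The realized measure tracks the underlying volatility far more closely than a
# single daily return does, which is one intuition for why RV-based forecasts help.
```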

Keywords: volatility, GARCH model, realized volatility, high frequency data

Procedia PDF Downloads 468
28107 A Framework on Data and Remote Sensing for Humanitarian Logistics

Authors: Vishnu Nagendra, Marten Van Der Veen, Stefania Giodini

Abstract:

Effective humanitarian logistics operations are a cornerstone of successful disaster relief operations. However, to be effective, they need to be demand driven and supported by adequate data for prioritization. Without this data, operations are carried out in an ad hoc manner and eventually become chaotic. The current availability of geospatial data helps in creating models for predictive damage and vulnerability assessment, which can be of great advantage to logisticians in gaining an understanding of the nature and extent of the disaster damage. This translates into actionable information on the demand for relief goods, the state of the transport infrastructure, and subsequently the priority areas for relief delivery. However, due to the unpredictable nature of disasters, the accuracy of the models needs improvement, which can be achieved using remote sensing data from UAVs (unmanned aerial vehicles) or satellite imagery, which again come with certain limitations. This research addresses the need for a framework to combine data from different sources to support humanitarian logistics operations and prediction models. The focus is on developing a workflow to combine data from satellites and UAVs after a disaster strikes. A three-step approach is followed: first, the data requirements for logistics activities are made explicit, which is done by carrying out semi-structured interviews with logistics workers in the field. Second, the limitations of current data collection tools are analyzed to develop workaround solutions by following a systems design approach. Third, the data requirements and the developed workaround solutions are fitted together into a coherent workflow. The outcome of this research will provide a new method for logisticians to have immediately available, accurate, and reliable data to support data-driven decision making.

Keywords: unmanned aerial vehicles, damage prediction models, remote sensing, data driven decision making

Procedia PDF Downloads 351
28106 Mapping Poverty in the Philippines: Insights from Satellite Data and Spatial Econometrics

Authors: Htet Khaing Lin

Abstract:

This study explores the relationship between a diverse set of variables, encompassing both environmental and socio-economic factors, and poverty levels in the Philippines for the years 2012, 2015, and 2018. Employing Ordinary Least Squares (OLS), Spatial Lag Models (SLM), and Spatial Error Models (SEM), this study delves into the dynamics of key indicators, including daytime and nighttime land surface temperature, cropland surface, urban land surface, rainfall, population size, normalized difference water, vegetation, and drought indices. The findings reveal consistent patterns and unexpected correlations, highlighting the need for nuanced policies that address the multifaceted challenges arising from the interplay of environmental and socio-economic factors.
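One standard step in the OLS-to-SLM/SEM workflow is to test the OLS residuals for spatial autocorrelation with Moran's I under a row-standardized weight matrix. The Philippine poverty data and satellite covariates are not reproduced; the sketch below uses a small synthetic grid.

```python
# Sketch: Moran's I of OLS residuals under rook-contiguity weights (synthetic grid).
import numpy as np

rng = np.random.default_rng(0)
side = 10
n = side * side

# Rook-contiguity weights on a 10x10 grid, row-standardized.
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        k = i * side + j
        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            if 0 <= i + di < side and 0 <= j + dj < side:
                W[k, (i + di) * side + (j + dj)] = 1.0
W = W / W.sum(axis=1, keepdims=True)

# Synthetic covariate and a spatially smooth outcome (e.g. a poverty rate).
x = rng.normal(size=n)
spatial_trend = np.repeat(np.linspace(0, 1, side), side)   # smooth north-south gradient
y = 0.5 * x + spatial_trend + rng.normal(0, 0.1, n)

# OLS residuals.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta

# Moran's I = (n / S0) * (e' W e) / (e' e), with S0 the sum of all weights.
S0 = W.sum()
moran_i = (n / S0) * (e @ W @ e) / (e @ e)
print(f"Moran's I of OLS residuals: {moran_i:.2f}")   # clearly positive -> consider SLM/SEM
```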

Keywords: poverty analysis, OLS, spatial lag models, spatial error models, Philippines, google earth engine, satellite data, environmental dynamics, socio-economic factors

Procedia PDF Downloads 60
28105 Assimilating Remote Sensing Data Into Crop Models: A Global Systematic Review

Authors: Luleka Dlamini, Olivier Crespo, Jos van Dam

Abstract:

Accurately estimating crop growth and yield is pivotal for timely, sustainable agricultural management and for ensuring food security. Crop models and remote sensing can complement each other and form a robust analysis tool to improve crop growth and yield estimations when combined. This study thus aims to systematically evaluate how research that exclusively focuses on assimilating remote sensing (RS) data into crop models varies among countries, crops, data assimilation methods, and farming conditions. A strict search string was applied in the Scopus and Web of Science databases, and 497 potential publications were obtained. After screening for relevance with predefined inclusion/exclusion criteria, 123 publications were considered in the final review. Results indicate that over 81% of the studies were conducted in countries associated with high socio-economic and technological advancement, mainly China, the United States of America, France, Germany, and Italy. Many of these studies integrated MODIS or Landsat data into WOFOST to improve crop growth and yield estimation of staple crops at the field and regional scales. Most studies use recalibration or updating methods alongside various algorithms to assimilate remotely sensed leaf area index into crop models. However, these methods cannot account for the uncertainties in remote sensing observations and the crop model itself. Over 85% of the studies were based on commercial and irrigated farming systems. Despite great global interest in data assimilation into crop models, limited research has been conducted in resource- and data-limited regions like Africa. We foresee great potential for such applications in those conditions and hence encourage facilitating and expanding the use of such an approach, from which developing farming communities could benefit.
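The "updating" family of assimilation methods discussed in the review can be sketched very simply: at an observation date, a simulated leaf area index (LAI) state is nudged toward the remotely sensed LAI with a scalar gain. The crop model (e.g., WOFOST) and the satellite LAI product are replaced by toy stand-ins; gains, error variances, and observation values below are assumptions.

```python
# Tiny sketch of state updating: nudge simulated LAI toward satellite LAI retrievals.
import numpy as np

def toy_lai_model(lai, day):
    """Placeholder logistic-style LAI growth step (not a real crop model)."""
    growth_rate = 0.08 if day < 80 else -0.02
    return max(lai + growth_rate * lai * (1 - lai / 6.0), 0.0)

obs_days = {30: 1.4, 50: 3.1, 70: 4.6}       # hypothetical satellite LAI retrievals
P, R = 0.5, 0.3                              # assumed model and observation error variances

lai_open_loop, lai_assim = 0.1, 0.1
for day in range(1, 121):
    lai_open_loop = toy_lai_model(lai_open_loop, day)
    lai_assim = toy_lai_model(lai_assim, day)
    if day in obs_days:
        K = P / (P + R)                                   # scalar Kalman-style gain
        lai_assim += K * (obs_days[day] - lai_assim)      # update toward the observation

print(f"end-of-season LAI, open loop: {lai_open_loop:.2f}, assimilated: {lai_assim:.2f}")
# In the reviewed studies the updated state (or recalibrated parameters) then
# drives the yield simulation.
```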

Keywords: crop models, remote sensing, data assimilation, crop yield estimation

Procedia PDF Downloads 91
28103 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in wireless sensor networks (WSNs) under two interference models: the graph model and the more realistic physical interference model known as signal-to-interference-plus-noise ratio (SINR). The main issue of the problem is to compute schedules with the minimum number of timeslots, that is, to compute minimum-latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate the problem with the bounded-sized message model and introduce a constant-factor approximation algorithm. To the best of our knowledge, our result is the first for the data collection problem with the bounded-sized message model under both interference models.
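Collision-free scheduling under the simpler graph interference model can be sketched as follows: model transmissions toward the sink as links, declare two links in conflict when they share a node, and assign timeslots by greedy coloring of the conflict graph. This is only an illustration of the graph model; the paper's bounded-sized-message algorithm and its SINR analysis are not reproduced, and the small gathering tree below is made up.

```python
# Sketch: timeslot assignment by greedy coloring of a link conflict graph.
import itertools
import networkx as nx

# A small data-gathering tree rooted at the sink (node 0); edges are transmission links.
tree = nx.Graph([(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6), (4, 7)])

# Conflict graph: one vertex per link, an edge between links that share a node.
links = list(tree.edges())
conflict = nx.Graph()
conflict.add_nodes_from(links)
for a, b in itertools.combinations(links, 2):
    if set(a) & set(b):
        conflict.add_edge(a, b)

slots = nx.coloring.greedy_color(conflict, strategy="largest_first")
n_slots = max(slots.values()) + 1
print(f"schedule uses {n_slots} timeslots")
for link, slot in sorted(slots.items(), key=lambda kv: kv[1]):
    print(f"  slot {slot}: link {link}")
# With bounded-sized messages, a link may additionally need several slots to
# forward all aggregated data, which is where the latency analysis gets harder.
```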

Keywords: data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks

Procedia PDF Downloads 188
28102 Comparative Analysis of Effecting Factors on Fertility by Birth Order: A Hierarchical Approach

Authors: Ali Hesari, Arezoo Esmaeeli

Abstract:

Given the dramatic changes in fertility and higher-order births during recent decades in Iran, knowledge about the factors affecting different birth orders is of crucial importance. In this study, in line with the hierarchical structure of much social science data and the effect of variables at different levels of social phenomena, the determinants of different birth orders in the 365 days preceding the 1390 census are explored using a multilevel approach. In this paper, the 2% individual-level sample of the 1390 census is analyzed with the HLM software. Three different hierarchical linear regression models are estimated: for first and second births, for third births, and for fourth and higher-order births. The results display different outcomes for the three models. The individual-level variables entered in the equations are region of residence (rural/urban), age, educational level, and labor force participation status, and the province-level variable is GDP per capita. Results show that the individual-level variables have different effects in these three models, and at the second level we find different random and fixed effects across the models.
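A two-level random-intercept model of the kind estimated in HLM (individuals nested in provinces, with GDP per capita as the province-level predictor) can be sketched with statsmodels on placeholder data. The variable names, coding, and the linear treatment of a binary outcome below are simplifications, not the census specification.

```python
# Sketch: individuals nested in provinces, random intercept per province.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_provinces, n_per = 30, 200
province = np.repeat(np.arange(n_provinces), n_per)
gdp_pc = np.repeat(rng.normal(0, 1, n_provinces), n_per)        # level-2 covariate
province_effect = np.repeat(rng.normal(0, 0.3, n_provinces), n_per)

df = pd.DataFrame({
    "province": province,
    "gdp_pc": gdp_pc,
    "age": rng.uniform(15, 49, n_provinces * n_per),
    "urban": rng.integers(0, 2, n_provinces * n_per),
    "educ": rng.integers(0, 4, n_provinces * n_per),
})
# Outcome: e.g. whether a birth of a given order occurred (modeled linearly here).
df["birth"] = ((0.3 - 0.005 * df["age"] - 0.05 * df["urban"] - 0.03 * df["educ"]
                - 0.04 * df["gdp_pc"] + province_effect
                + rng.normal(0, 0.4, len(df))) > 0).astype(float)

# Random intercept for province; gdp_pc enters as the province-level predictor.
model = smf.mixedlm("birth ~ age + urban + educ + gdp_pc", df, groups=df["province"])
print(model.fit().summary())
```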

Keywords: fertility, birth order, hierarchical approach, fixed effects, random effects

Procedia PDF Downloads 315
28101 Measuring Environmental Efficiency of Energy in OPEC Countries

Authors: Bahram Fathi, Seyedhossein Sajadifar, Naser Khiabani

Abstract:

Data envelopment analysis (DEA) has recently gained popularity in energy efficiency analysis. A common feature of the previously proposed DEA models for measuring energy efficiency performance is that they treat energy consumption as an input within a production framework without considering undesirable outputs. However, energy use results in the generation of undesirable outputs as byproducts of producing desirable outputs. Within a joint production framework of both desirable and undesirable outputs, this paper presents several DEA-type linear programming models for measuring energy efficiency performance. In addition to considering undesirable outputs, our models treat different energy sources as different inputs so that changes in energy mix could be accounted for in evaluating energy efficiency. The proposed models are applied to measure the energy efficiency performances of 12 OPEC countries and the results obtained are presented.
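The envelopment form of the basic input-oriented CCR DEA score can be sketched as a small linear program. The paper's models additionally treat undesirable outputs and a disaggregated energy mix, which this sketch omits; the input-output data below are made up.

```python
# Sketch: input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

# inputs: energy use, labour        output: GDP (illustrative values)
X = np.array([[10.0, 5.0], [12.0, 6.0], [8.0, 9.0], [15.0, 4.0]]).T   # (n_inputs, n_dmu)
Y = np.array([[100.0], [110.0], [90.0], [105.0]]).T                   # (n_outputs, n_dmu)
n_dmu = X.shape[1]

def ccr_efficiency(k):
    """min theta s.t. X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0."""
    c = np.concatenate([[1.0], np.zeros(n_dmu)])            # variables: theta, lambda_1..n
    A_ub = np.vstack([
        np.hstack([-X[:, [k]], X]),                         # X @ lam - theta * x_k <= 0
        np.hstack([np.zeros((Y.shape[0], 1)), -Y]),         # -Y @ lam <= -y_k
    ])
    b_ub = np.concatenate([np.zeros(X.shape[0]), -Y[:, k]])
    bounds = [(None, None)] + [(0, None)] * n_dmu
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for k in range(n_dmu):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
# Undesirable outputs (e.g. emissions) would require modifying these constraints,
# which is what the paper's models address.
```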

Keywords: energy efficiency, undesirable outputs, data envelopment analysis

Procedia PDF Downloads 705
28100 Mathematical Modeling of Drip Emitter Discharge of Trapezoidal Labyrinth Channel

Authors: N. Philipova

Abstract:

The influence of the geometric parameters of a trapezoidal labyrinth channel on the emitter discharge is investigated in this work. Among the geometric parameters of the labyrinth channel, the impact of the dentate angle, the dentate spacing, and the dentate height is studied. Numerical simulations of the water flow are performed according to a central composite design using the commercial codes GAMBIT and FLUENT. The inlet pressure of the dripper is set to 1 bar. The objective of this paper is to derive a mathematical model of the emitter discharge depending on the dentate angle, the dentate spacing, and the dentate height of the labyrinth channel. As a result, the obtained mathematical model is a second-order polynomial including two-way interactions among the geometric parameters. The dentate spacing has the most important and positive influence on the emitter discharge, followed by the simultaneous impact of the dentate spacing and the dentate height. The dentate angle in the observed interval has no significant effect on the emitter discharge. The obtained model can be used as a basis for future emitter design.
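The regression step, fitting a second-order polynomial with two-way interactions to (dentate angle, spacing, height) versus discharge, can be sketched as follows. The CFD results from GAMBIT/FLUENT are not reproduced; the design points and response values below are synthetic placeholders.

```python
# Sketch: second-order response surface with interaction terms for emitter discharge.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_runs = 20                                   # central-composite-style design size
angle = rng.uniform(30, 60, n_runs)           # dentate angle [deg]
spacing = rng.uniform(1.0, 2.0, n_runs)       # dentate spacing [mm]
height = rng.uniform(0.8, 1.5, n_runs)        # dentate height [mm]
X = np.column_stack([angle, spacing, height])

# Placeholder discharge [L/h]: dominated by spacing, with a spacing-height interaction.
q = 1.2 + 0.9 * spacing + 0.3 * spacing * height + 0.01 * angle \
    + rng.normal(0, 0.02, n_runs)

model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(X, q)
print("R^2 on the design points:", round(model.score(X, q), 3))
print("terms:", model[0].get_feature_names_out(["angle", "spacing", "height"]))
print("coefficients:", np.round(model[1].coef_, 3))
```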

Keywords: drip irrigation, labyrinth channel hydrodynamics, numerical simulations, Reynolds stress model

Procedia PDF Downloads 164
28099 Bounds on the Laplacian Vertex PI Energy

Authors: Ezgi Kaya, A. Dilek Maden

Abstract:

A topological index is a number related to a graph that is invariant under graph isomorphism. In theoretical chemistry, molecular structure descriptors (also called topological indices) are used for modeling the physicochemical, pharmacologic, toxicologic, biological, and other properties of chemical compounds. Let G be a graph with n vertices and m edges. For a given edge uv, the quantity nu(e) denotes the number of vertices closer to u than to v; the quantity nv(e) is defined analogously. The vertex PI index is defined as the sum of nu(e) and nv(e), where the sum is taken over all edges of G. The energy of a graph is defined as the sum of the absolute values of the eigenvalues of the adjacency matrix of G, and the Laplacian energy of a graph is defined as the sum of the absolute values of the differences between the Laplacian eigenvalues and the average degree of G. In theoretical chemistry, the pi-electron energy of a conjugated carbon molecule, computed using the Hückel theory, coincides with the graph energy; hence results on graph energy assume special significance. The Laplacian matrix of a graph G weighted by the vertex PI weighting is the Laplacian vertex PI matrix, and the Laplacian vertex PI eigenvalues of a connected graph G are the eigenvalues of its Laplacian vertex PI matrix. In this study, the Laplacian vertex PI energy of a graph G is defined. We also give some bounds for the Laplacian vertex PI energy of graphs in terms of the vertex PI index, the sum of the squares of the entries in the Laplacian vertex PI matrix, and the absolute value of the determinant of the Laplacian vertex PI matrix.
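The quantities nu(e), nv(e) and the vertex PI index can be computed for a small graph as shown below. The edge weighting used to build the Laplacian vertex PI matrix is assumed here to be nu(e) + nv(e) per edge, which follows the abstract's wording but should be checked against the paper's exact definition.

```python
# Sketch: vertex PI index of C5 and eigenvalues of the PI-weighted Laplacian.
import networkx as nx
import numpy as np

G = nx.cycle_graph(5)                      # C5: 5 vertices, 5 edges

def pi_counts(G, u, v):
    du = nx.single_source_shortest_path_length(G, u)
    dv = nx.single_source_shortest_path_length(G, v)
    n_u = sum(1 for w in G if du[w] < dv[w])   # vertices strictly closer to u
    n_v = sum(1 for w in G if dv[w] < du[w])   # vertices strictly closer to v
    return n_u, n_v

pi_index = 0
for u, v in G.edges():
    n_u, n_v = pi_counts(G, u, v)
    pi_index += n_u + n_v
    G[u][v]["weight"] = n_u + n_v          # assumed vertex PI edge weighting
print("vertex PI index of C5:", pi_index)  # each edge contributes 2 + 2 = 4 -> 20

# Eigenvalues of the weighted Laplacian (the Laplacian vertex PI eigenvalues
# under the assumption above); the energy bounds in the paper are stated in
# terms of these eigenvalues.
L = nx.laplacian_matrix(G, weight="weight").toarray().astype(float)
print("Laplacian vertex PI eigenvalues:", np.round(np.linalg.eigvalsh(L), 3))
```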

Keywords: energy, Laplacian energy, laplacian vertex PI eigenvalues, Laplacian vertex PI energy, vertex PI index

Procedia PDF Downloads 207