Search results for: Box and Jenkins Models
6599 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue
Authors: Rachel Y. Zhang, Christopher K. Anderson
Abstract:
A critical aspect of revenue management is a firm’s ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexities of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including K-Nearest Neighbors, Support Vector Machine, Regression Tree, and Artificial Neural Network algorithms. The out-of-sample performance of the above approaches to forecasting hotel demand is illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A’s data) with models using market-level data (prices, review scores, location, chain scale, etc. for all hotels within the market). The proposed models will be valuable for predicting hotel revenue given the basic characteristics of a hotel property, and can also be applied to performance evaluation of an existing hotel. The findings will unveil the features that play key roles in a hotel’s revenue performance, which has considerable potential usefulness in both revenue prediction and evaluation.
Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine
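As a concrete illustration of this benchmarking setup, the following minimal scikit-learn sketch compares the four algorithm families named above; the feature names and synthetic demand values are illustrative stand-ins, not the authors' proprietary Las Vegas dataset.

```python
# Minimal sketch of the benchmarking setup; synthetic data stands in for the
# proprietary hotel transactional dataset described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))  # e.g. price, review score, location, chain scale (assumed)
y = 100 + 20 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=5, size=1000)  # demand proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "kNN": KNeighborsRegressor(n_neighbors=10),
    "SVM": SVR(C=10.0),
    "RegressionTree": DecisionTreeRegressor(max_depth=6),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
}
for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)  # scaling matters for kNN/SVM/ANN
    pipe.fit(X_tr, y_tr)
    err = mean_absolute_percentage_error(y_te, pipe.predict(X_te))
    print(f"{name}: out-of-sample MAPE = {err:.2%}")
```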
Procedia PDF Downloads 134
6598 Text Similarity in Vector Space Models: A Comparative Study
Authors: Omid Shahmirzadi, Adam Lugowski, Kenneth Younge
Abstract:
Automatic measurement of semantic text similarity is an important task in natural language processing. In this paper, we evaluate the performance of different vector space models on this task. We address the real-world problem of modeling patent-to-patent similarity and compare TFIDF (and related extensions), topic models (e.g., latent semantic indexing), and neural models (e.g., paragraph vectors). Contrary to expectations, the added computational cost of text embedding methods is justified only when 1) the target text is condensed and 2) the similarity comparison is trivial. TFIDF performs surprisingly well in other cases: in particular, for longer and more technical texts, and for making finer-grained distinctions between nearest neighbors. Unexpectedly, extensions to the TFIDF method, such as adding noun phrases or calculating term weights incrementally, were not helpful in our context.
Keywords: big data, patent, text embedding, text similarity, vector space model
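A minimal sketch of the TFIDF baseline discussed above, using scikit-learn; the toy documents stand in for patent texts.

```python
# TFIDF vectors plus cosine similarity: the baseline the abstract reports as
# surprisingly competitive for patent-to-patent comparison.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "A battery electrode comprising a lithium compound.",
    "An electrode for lithium-ion batteries with improved capacity.",
    "A method for training a neural network on text data.",
]
tfidf = TfidfVectorizer(stop_words="english")
vectors = tfidf.fit_transform(docs)   # sparse document-term matrix
sim = cosine_similarity(vectors)      # pairwise document similarity
print(sim.round(2))                   # the two battery documents score highest
```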
Procedia PDF Downloads 176
6597 Geographic Information System for District Level Energy Performance Simulations
Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck
Abstract:
The utilization of semantic, cadastral, and topological data from geographic information systems (GIS) has increased exponentially for building and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analysis, algorithms, and simulation tools. For dynamic energy simulations at city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models are also intended to describe building and construction industry data. For further investigation, CityGML data models are considered for simulations. Though geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE) is given along with its significance. Consequently, addressing specific simulation input data, a workflow using Modelica is presented in this paper, underlining the usage of GIS information and quantifying its significance for annual heating energy demand.
Keywords: CityGML, EnergyADE, energy performance simulation, GIS
Procedia PDF Downloads 172
6596 Talent-to-Vec: Using Network Graphs to Validate Models with Data Sparsity
Authors: Shaan Khosla, Jon Krohn
Abstract:
In a recruiting context, machine learning models are valuable for recommendations: to predict the best candidates for a vacancy, to match the best vacancies for a candidate, and to compile a set of similar candidates for any given candidate. While such models are useful to create, validating their accuracy in a recommendation context is difficult due to data sparsity. In this report, we use network graph data to generate useful representations for candidates and vacancies. We use candidates and vacancies as network nodes and designate a bidirectional link between them when the candidate has interviewed for the vacancy. After applying node2vec, the embeddings are used to construct a validation dataset with a ranked order, which will help validate new recommender systems.
Keywords: AI, machine learning, NLP, recruiting
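A minimal sketch of the embedding step under stated assumptions: unbiased random walks over a tiny bipartite candidate-vacancy graph, embedded with gensim's Word2Vec (equivalent to node2vec with p = q = 1); the graph is illustrative, not real recruiting data.

```python
# Bipartite interview graph -> random walks -> skip-gram node embeddings.
import random
import networkx as nx
from gensim.models import Word2Vec

G = nx.Graph()
G.add_edges_from([("cand_1", "vac_A"), ("cand_2", "vac_A"),
                  ("cand_2", "vac_B"), ("cand_3", "vac_B")])  # interview links (toy data)

def random_walks(graph, num_walks=20, walk_length=8, seed=0):
    """Unbiased walks from every node; node2vec with p = q = 1."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for node in graph.nodes:
            walk = [node]
            for _ in range(walk_length - 1):
                walk.append(rng.choice(list(graph.neighbors(walk[-1]))))
            walks.append(walk)
    return walks

model = Word2Vec(random_walks(G), vector_size=16, window=3, min_count=1, sg=1, seed=0)
# Similar candidates end up close in embedding space:
print(model.wv.most_similar("cand_1", topn=2))
```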
Procedia PDF Downloads 87
6595 Bridging the Gap between Different Interfaces for Business Process Modeling
Authors: Katalina Grigorova, Kaloyan Mironov
Abstract:
The paper focuses on the benefits of business process modeling. Although this discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing user needs. Because one of these needs is related to the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards using workflow patterns as an intermediate tool. Nowadays there are many systems for business process modeling, and the variety of output formats is almost as great as the variety of systems themselves. This diversity further hampers the conversion of models. The presented study is aimed at discussing problems caused by differences in the output formats of various modeling environments.
Keywords: business process modeling, business process modeling standards, workflow patterns, converting models
Procedia PDF Downloads 588
6594 Hybrid Project Management Model Based on Lean and Agile Approach
Authors: Fatima-Zahra Eddoug, Jamal Benhra, Rajaa Benabbou
Abstract:
Several project management models exist in the literature, and the most widely used are hybrid models, owing to their multiple advantages. Our objective in this paper is to analyze the existing models based on the Lean and Agile approaches and to propose a novel framework with convenient tools that will allow efficient management of a general project. To create the desired framework, we drew essentially on seven existing models. Among the agile tools, only Scrum was identified by several authors as appropriate for project management. In contrast, multiple lean tools were proposed for different phases of the project.
Keywords: agility, hybrid project management, lean, scrum
Procedia PDF Downloads 139
6593 Multiple Linear Regression for Rapid Estimation of Subsurface Resistivity from Apparent Resistivity Measurements
Authors: Sabiu Bala Muhammad, Rosli Saad
Abstract:
Multiple linear regression (MLR) models for fast estimation of true subsurface resistivity from apparent resistivity field measurements are developed and assessed in this study. The parameters investigated were apparent resistivity (ρₐ), horizontal location (X), and depth (Z) of measurement as the independent variables, and true resistivity (ρₜ) as the dependent variable. To achieve linearity in both resistivity variables, the datasets were first transformed into the logarithmic domain, following diagnostic checks of normality of the dependent variable and of heteroscedasticity to ensure accurate models. Four MLR models were developed based on hierarchical combination of the independent variables. The generated MLR coefficients were applied to another dataset to estimate ρₜ values for validation. Contours of the estimated ρₜ values were plotted and compared to plots of the observed data, using the same colour scale and blanking, for visual assessment. The accuracy of the models was assessed using the coefficient of determination (R²), standard error (SE), and weighted mean absolute percentage error (wMAPE). It is concluded that the MLR models can estimate ρₜ with a high level of accuracy.
Keywords: apparent resistivity, depth, horizontal location, multiple linear regression, true resistivity
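A minimal sketch of one such log-domain MLR model; the synthetic values and the assumed relation between ρₐ and ρₜ are purely illustrative, not field data.

```python
# Log-transform both resistivity variables, then fit rho_t from rho_a, X, Z.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
rho_a = 10 ** rng.uniform(1, 3, 500)    # apparent resistivity (ohm-m)
X = rng.uniform(0, 100, 500)            # horizontal location (m)
Z = rng.uniform(1, 30, 500)             # depth (m)
rho_t = rho_a * (1 + 0.01 * Z) * 10 ** rng.normal(0, 0.05, 500)  # assumed relation

features = np.column_stack([np.log10(rho_a), X, Z])
target = np.log10(rho_t)                # log transform linearizes the resistivities

mlr = LinearRegression().fit(features, target)
pred = mlr.predict(features)
print("R^2 =", round(r2_score(target, pred), 3))
print("estimated rho_t (ohm-m):", np.round(10 ** pred[:3], 1))
```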
Procedia PDF Downloads 278
6592 Evaluation of Newly Synthesized Steroid Derivatives Using In silico Molecular Descriptors and Chemometric Techniques
Authors: Milica Ž. Karadžić, Lidija R. Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Z. Kovačević, Anamarija I. Mandić, Katarina Penov-Gaši, Andrea R. Nikolić, Aleksandar M. Oklješa
Abstract:
This study considered the selection of in silico molecular descriptors and models for describing newly synthesized steroid derivatives and characterizing them using chemometric techniques. Multiple linear regression (MLR) models were established and gave the best molecular descriptors for quantitative structure-retention relationship (QSRR) modeling of the retention of the investigated molecules. According to the variance inflation factor (VIF) values, the MLR models were free of multicollinearity among the selected molecular descriptors. The molecular descriptors used were ranked with the generalized pair correlation method (GPCM), which can detect significant differences between independent variables even when their correlations with the dependent variable are almost equal. The generated MLR models were statistically validated and cross-validated, and the best models were kept. The models were then ranked using the sum of ranking differences (SRD) method, which identifies the most consistent QSRR model and reveals similarity or dissimilarity between the models. In this study, SRD was performed using average values of the experimentally observed data as a golden standard. Chemometric analysis was conducted in order to characterize the newly synthesized steroid derivatives for further investigation regarding their potential biological activity and further synthesis. This article is based upon work from COST Action CM1105, supported by COST (European Cooperation in Science and Technology).
Keywords: generalized pair correlation method, molecular descriptors, regression analysis, steroids, sum of ranking differences
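A minimal sketch of the VIF-based multicollinearity check mentioned above, with random stand-ins for the real molecular descriptors.

```python
# Variance inflation factors flag collinear descriptors before MLR fitting.
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
desc = pd.DataFrame({
    "logP": rng.normal(size=50),
    "polar_surface_area": rng.normal(size=50),
    "molar_volume": rng.normal(size=50),
})
# Add a deliberately collinear descriptor to show what VIF catches:
desc["redundant"] = desc["logP"] * 0.9 + rng.normal(scale=0.1, size=50)

vifs = {col: variance_inflation_factor(desc.values, i)
        for i, col in enumerate(desc.columns)}
print(vifs)  # VIF well above ~5-10 marks descriptors to drop from the QSRR model
```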
Procedia PDF Downloads 348
6591 Estimating Lost Digital Video Frames Using Unidirectional and Bidirectional Estimation Based on Autoregressive Time Model
Authors: Navid Daryasafar, Nima Farshidfar
Abstract:
In this article, we attempt to conceal errors in video, with an emphasis on the temporal use of autoregressive (AR) models. We assume that all information in one or more video frames is lost. The lost frames are then estimated using the temporal information of corresponding pixels in successive frames. After presenting autoregressive models and how they are applied to estimate lost frames, two general methods of using these models are presented. The first method, the standard autoregressive approach, estimates the lost frame unidirectionally: information from previous frames is used to estimate the lost frame. In the second method, information from both the previous and the following frames is used to estimate the lost frame; as a result, this method is known as bidirectional estimation. A series of tests is then carried out to assess the performance of each method in different modes, and the results are compared.
Keywords: error steganography, unidirectional estimation, bidirectional estimation, AR linear estimation
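A minimal sketch of the two estimation modes for a single pixel's intensity across frames, using statsmodels' AutoReg; real use would loop over all pixels of the lost frame, and the series here is synthetic.

```python
# Unidirectional vs. bidirectional AR estimation of one lost-frame pixel value.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(3)
series = 128 + np.cumsum(rng.normal(scale=2, size=60))  # one pixel across 60 frames
lost = 40                                               # index of the lost frame

# Unidirectional: fit AR on frames before the loss, forecast one step ahead.
fwd = AutoReg(series[:lost], lags=4).fit()
est_uni = fwd.predict(start=lost, end=lost)[0]

# Bidirectional: also fit on the reversed "future" frames and average both estimates.
bwd = AutoReg(series[:lost:-1], lags=4).fit()
est_back = bwd.predict(start=len(series) - lost - 1, end=len(series) - lost - 1)[0]
est_bi = 0.5 * (est_uni + est_back)

print(f"true {series[lost]:.1f}  unidirectional {est_uni:.1f}  bidirectional {est_bi:.1f}")
```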
Procedia PDF Downloads 541
6590 Evaluating Hyperelastic Properties of Geotextiles under Uniaxial Loading
Authors: Belhadj Fatma Zohra, Belhadj Ahmed Fouad, Chabaat Mohamed
Abstract:
The properties of geotextiles can impact the long-term behavior of reinforced soils, which can lead to unexpected problems such as instability and excessive deformation. Research into the material’s rheological properties and nonlinear behavior is required to overcome this issue. This study focuses on six isotropic hyperelastic models (Neo-Hooke, Mooney-Rivlin, Ogden, Yeoh, Arruda-Boyce, and Van der Waals) commonly used to describe the behavior of PET woven geotextiles in civil engineering applications. The models are fitted to experimental data from uniaxial tension tests in the warp and weft directions; the Yeoh and Neo-Hooke models accurately predict the behavior of these geotextiles. The study aims to enhance understanding of how geotextiles behave under varying loads through testing and finite element simulations. The strong correlation between experimental and simulation results can help develop hyperelastic material models for geotextiles. This framework can benefit manufacturers and engineers in effectively addressing soil-structure interaction concerns in their projects.
Keywords: soil-structure interaction interface, geotextiles rheological characteristics, hyperelastic models, uniaxial tension testing, FEA modeling
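A minimal sketch of fitting one of the six models (Yeoh) to uniaxial tension data with scipy; the "measured" points below are synthetic stand-ins for geotextile test data.

```python
# For an incompressible material under uniaxial stretch lam, I1 = lam^2 + 2/lam
# and the Yeoh nominal stress is
#   P = 2*(lam - lam**-2) * (C1 + 2*C2*(I1-3) + 3*C3*(I1-3)**2).
import numpy as np
from scipy.optimize import curve_fit

def yeoh_uniaxial(lam, c1, c2, c3):
    i1 = lam**2 + 2.0 / lam
    return 2.0 * (lam - lam**-2) * (c1 + 2.0 * c2 * (i1 - 3.0) + 3.0 * c3 * (i1 - 3.0) ** 2)

lam_data = np.linspace(1.0, 1.6, 15)                 # stretch ratios
p_data = yeoh_uniaxial(lam_data, 5.0, -0.5, 0.8)     # assumed "true" response
p_data += np.random.default_rng(4).normal(scale=0.05, size=15)  # measurement noise

(c1, c2, c3), _ = curve_fit(yeoh_uniaxial, lam_data, p_data, p0=(1.0, 0.0, 0.0))
print(f"fitted Yeoh constants: C1={c1:.3f}, C2={c2:.3f}, C3={c3:.3f}")
```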
Procedia PDF Downloads 8
6589 Validating Condition-Based Maintenance Algorithms through Simulation
Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile
Abstract:
Industrial end-users are currently facing an increasing need to reduce the risk of unexpected failures and optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle those two issues using both machine learning and first-principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate physics, systems, and humans (including asset maintenance operations) in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives.
Keywords: degradation models, ageing, anomaly detection, soft sensor, incremental learning
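A minimal sketch of the short-term deviation idea: a detector that incrementally learns normal behavior (Welford's running mean/variance) and raises a z-score alarm. The threshold and simulated readings are illustrative assumptions, not Schneider Electric's implementation.

```python
import random

class IncrementalDetector:
    """Running mean/variance (Welford) with a z-score alarm for short-term deviations."""

    def __init__(self, z_threshold=4.0, warmup=30):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold, self.warmup = z_threshold, warmup

    def update(self, x):
        # Score first: an anomalous reading must not pollute the learned statistics.
        if self.n >= self.warmup:
            std = (self.m2 / (self.n - 1)) ** 0.5
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                return True  # statistically significant short-term deviation
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return False

random.seed(0)
readings = [20.0 + random.gauss(0, 0.5) for _ in range(100)] + [30.0]  # normal data, then a fault
detector = IncrementalDetector()
print("alarm at samples:", [i for i, x in enumerate(readings) if detector.update(x)])
```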
Procedia PDF Downloads 126
6588 Learning Predictive Models for Efficient Energy Management of Exhibition Hall
Authors: Jeongmin Kim, Eunju Lee, Kwang Ryel Ryu
Abstract:
This paper addresses the problem of predictive control for energy management of large-scale exhibition halls, where a lot of energy is consumed to maintain the internal atmosphere under certain required conditions. Predictive control achieves better energy efficiency by optimizing the operation of air-conditioning facilities with not only the current but also future status taken into account. In this paper, we propose using predictive models learned from past sensor data of the hall environment to optimize the operating plan for the air-conditioning facilities by simulating future environmental change. We have implemented an emulator of an exhibition hall using EnergyPlus, a widely used building energy simulation tool, to collect data for learning environment-change models. Experimental results show that the learned models predict future change highly accurately on a short-term basis.
Keywords: predictive control, energy management, machine learning, optimization
Procedia PDF Downloads 274
6587 Empirical Roughness Progression Models of Heavy Duty Rural Pavements
Authors: Nahla H. Alaswadko, Rayya A. Hassan, Bayar N. Mohammed
Abstract:
Empirical deterministic models have been developed to predict roughness progression of heavy duty spray sealed pavements for a dataset representing rural arterial roads. The dataset provides a good representation of the relevant network and covers a wide range of operating and environmental conditions. A large sample of historical time series data for many pavement sections has been collected and prepared for use in multilevel regression analysis. The modelling parameters include road roughness as the performance parameter, and traffic loading, time, initial pavement strength, reactivity level of the subgrade soil, climate condition, and condition of the drainage system as predictor parameters. The purpose of this paper is to report the approaches adopted for model development and validation. The study presents multilevel models that can account for the correlation among time series data of the same section and capture the effect of unobserved variables. Study results show that the models fit the data very well. The contribution and significance of the relevant influencing factors in predicting roughness progression are presented and explained. The paper concludes that the analysis approach used for developing the models confirmed their accuracy and reliability through a good fit to the validation data.
Keywords: roughness progression, empirical model, pavement performance, heavy duty pavement
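A minimal sketch of the multilevel idea using statsmodels: a random intercept per pavement section accounts for correlation within each section's time series. The variable names and the synthetic panel are illustrative assumptions, not the study's dataset.

```python
# Mixed-effects (multilevel) regression: fixed effects for time and traffic,
# random intercept per section capturing unobserved section-level variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
rows = []
for section in range(40):
    section_effect = rng.normal(scale=0.3)   # unobserved section-level variable
    for year in range(10):
        traffic = rng.uniform(0.5, 2.0)      # cumulative loading proxy
        iri = 2.0 + 0.15 * year + 0.4 * traffic + section_effect + rng.normal(scale=0.1)
        rows.append({"section": section, "year": year, "traffic": traffic, "iri": iri})
data = pd.DataFrame(rows)

model = smf.mixedlm("iri ~ year + traffic", data, groups=data["section"])
print(model.fit().summary())
```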
Procedia PDF Downloads 168
6586 Wind Power Forecast Error Simulation Model
Authors: Josip Vasilj, Petar Sarajcev, Damir Jakus
Abstract:
One of the major difficulties introduced by wind power penetration is the inherent uncertainty in production originating from uncertain wind conditions. This uncertainty impacts many different aspects of power system operation, especially the balancing power requirements. For this reason, in power system development planning, it is necessary to evaluate the potential uncertainty in future wind power generation. For this purpose, simulation models are required that reproduce the performance of wind power forecasts. This paper presents wind power forecast error simulation models based on stochastic process simulation. The proposed models capture the most important statistical parameters recognized in wind power forecast error time series. Furthermore, two distinct models are presented based on data availability. The first model uses wind speed measurements at potential or existing wind power plant locations, while the second model uses the statistical distribution of wind speeds.
Keywords: wind power, uncertainty, stochastic process, Monte Carlo simulation
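A minimal sketch of the stochastic-process approach: Monte Carlo simulation of forecast error trajectories as a bounded AR(1) process. The autocorrelation and noise parameters are illustrative, not fitted values.

```python
# Monte Carlo simulation of wind power forecast error as an AR(1) process,
# clipped to the physically possible range (fraction of rated power).
import numpy as np

def simulate_forecast_error(n_steps, phi=0.9, sigma=0.05, n_runs=1000, seed=6):
    """Return n_runs simulated error trajectories of length n_steps."""
    rng = np.random.default_rng(seed)
    errors = np.zeros((n_runs, n_steps))
    for t in range(1, n_steps):
        noise = rng.normal(scale=sigma, size=n_runs)
        errors[:, t] = np.clip(phi * errors[:, t - 1] + noise, -1.0, 1.0)
    return errors

runs = simulate_forecast_error(n_steps=48)  # two days, hourly resolution
print("std of error by horizon:", runs.std(axis=0)[[1, 12, 24, 47]].round(3))
# The spread grows with lead time, mimicking how real forecast errors behave.
```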
Procedia PDF Downloads 485
6585 A Comparative Study of Regional Climate Models and Global Coupled Models over Uttarakhand
Authors: Sudip Kumar Kundu, Charu Singh
Abstract:
As a great physiographic divide, the Himalayas affect a large system of water and air circulation, which helps determine the climatic conditions of the Indian subcontinent to the south and the mid-Asian highlands to the north. The range acts as a barrier, blocking cold continental air from the north from entering India in winter, and forcing the rain-bearing southwesterly monsoon to give up most of its precipitation in the region during the monsoon season. Nowadays, extreme weather events such as heavy precipitation, cloudbursts, flash floods, landslides, and extreme avalanches occur regularly in the North Western Himalayan (NWH) region. The present study was planned to identify the most suitable model(s) for investigating the rainfall pattern over this region. For this investigation, selected models from the Coordinated Regional Climate Downscaling Experiment (CORDEX) and the Coupled Model Intercomparison Project Phase 5 (CMIP5) have been utilized in a consistent framework for the period 1976 to 2000 (historical). The ability of these driving models from the CORDEX domain and CMIP5 has been examined in terms of reproducing the spatial distribution and the time series of rainfall over NWH in the rainy season, compared with the ground-based Indian Meteorological Department (IMD) gridded rainfall dataset. The analysis shows that models such as MIROC5 and MPI-ESM-LR, from both CORDEX and CMIP5, provide the best spatial distribution of rainfall over the NWH region. However, the CORDEX driving models underestimate the daily rainfall amount compared to the CMIP5 driving models, failing to capture daily rainfall properly when plotted as time series (TS) individually for the states of Uttarakhand (UK) and Himachal Pradesh (HP). It can therefore be concluded that the CMIP5 driving models are better suited than the CORDEX domain models for investigating the rainfall pattern over the NWH region.
Keywords: global warming, rainfall, CMIP5, CORDEX, NWH
Procedia PDF Downloads 169
6584 Predicting Options Prices Using Machine Learning
Authors: Krishang Surapaneni
Abstract:
The goal of this project is to determine how to predict important aspects of options, including the ask price. We compare different machine learning models to find the best model, and the best hyperparameters for that model, for this purpose and dataset. Option pricing is a relatively new field, and it can be very complicated and intimidating, especially to inexperienced people, so we want to create a machine learning model that can predict important aspects of an option, which can aid future research. We tested multiple models and experimented with hyperparameter tuning to find some of the best parameters for a machine learning model. We tested three different models: a random forest regressor, a linear regressor, and an MLP (multi-layer perceptron) regressor. The most important feature in this experiment is the ask price, which is what we were trying to predict. In the field of stock price prediction there is a large potential for error, so we cannot judge the models on whether they predict prices perfectly. For this reason, we determined the accuracy of each model as the average percentage difference between the predicted and actual values, comparing the actual results in the testing data with the predictions made by the models. The linear regression model performed worst, with an average percentage error of 17.46%. The MLP regressor had an average percentage error of 11.45%, and the random forest regressor had an average percentage error of 7.42%.
Keywords: finance, linear regression model, machine learning model, neural network, stock price
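A minimal sketch of the comparison using the paper's accuracy measure (average percentage difference between predicted and actual ask prices); the synthetic option features are illustrative, not real market data.

```python
# Compare linear regression, MLP, and random forest on predicting the ask price.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(7)
n = 2000
strike = rng.uniform(50, 150, n)
underlying = rng.uniform(50, 150, n)
days_to_expiry = rng.uniform(1, 365, n)
# Assumed toy relation: intrinsic value plus a time-value term plus noise.
ask = np.maximum(underlying - strike, 0) + 0.05 * np.sqrt(days_to_expiry) + rng.normal(0, 0.5, n)
ask = np.clip(ask, 0.05, None)

X = np.column_stack([strike, underlying, days_to_expiry])
X_tr, X_te, y_tr, y_te = train_test_split(X, ask, test_size=0.2, random_state=0)

for name, model in [("Linear", LinearRegression()),
                    ("MLP", MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)),
                    ("RandomForest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X_tr, y_tr)
    print(name, f"{mean_absolute_percentage_error(y_te, model.predict(X_te)):.2%}")
```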
Procedia PDF Downloads 77
6583 The Martingale Options Price Valuation for European Puts Using Stochastic Differential Equation Models
Authors: H. C. Chinwenyi, H. D. Ibrahim, F. A. Ahmed
Abstract:
In modern financial mathematics, valuing derivatives such as options is often a tedious task. This is simply because their fair and correct future prices are probabilistic. This paper examines three different stochastic differential equation (SDE) models in finance: the Constant Elasticity of Variance (CEV) model, the Black-Karasinski model, and the Heston model. The martingale option price valuation formulas for these three models were obtained using the replicating portfolio method. The derived martingale option price valuation equations for the SDE models were then solved numerically using the Monte Carlo method, implemented in MATLAB. Furthermore, results from numerical examples using published data from the Nigeria Stock Exchange (NSE) All-Share Index show the effect of an increase in the underlying asset value (stock price) on the value of the European put option for these models. From the results obtained, we see that an increase in the stock price yields a decrease in the value of the European put option. Hence, this guides the option holder in making a sound decision by not exercising his right on the option.
Keywords: equivalent martingale measure, European put option, Girsanov theorem, martingales, Monte Carlo method, option price valuation formula
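A minimal sketch of the Monte Carlo valuation step under the risk-neutral (equivalent martingale) measure, shown here for geometric Brownian motion rather than the CEV, Black-Karasinski, or Heston dynamics used in the paper; all parameters are illustrative.

```python
# Price a European put as the discounted risk-neutral expectation of its payoff.
import numpy as np

def mc_european_put(s0, strike, r, sigma, maturity, n_paths=200_000, seed=8):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Terminal stock price under the risk-neutral measure (GBM dynamics):
    s_T = s0 * np.exp((r - 0.5 * sigma**2) * maturity + sigma * np.sqrt(maturity) * z)
    payoff = np.maximum(strike - s_T, 0.0)        # European put payoff
    return np.exp(-r * maturity) * payoff.mean()  # discounted expectation

for s0 in (90, 100, 110):
    print(s0, round(mc_european_put(s0, strike=100, r=0.05, sigma=0.2, maturity=1.0), 3))
# As the abstract reports, the put value falls as the underlying price rises.
```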
Procedia PDF Downloads 135
6582 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models
Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo De Magalhães
Abstract:
This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to the threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. In order to face this task, the proposed resolution method adopts a smoothing strategy using a special class of C∞ differentiable functions. The final estimate is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes possible the application of the most powerful minimization algorithms and allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented to illustrate both the reliability and the efficiency of the proposed approach.
Keywords: rainfall-runoff models, automatic calibration, hyperbolic smoothing method
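A minimal sketch of the smoothing idea on a toy threshold objective: max(x, 0) is replaced by the C∞ function (x + sqrt(x² + τ²))/2, and a sequence of smoothed subproblems is solved as τ → 0. The objective is illustrative, not an actual CRR model calibration.

```python
# Hyperbolic smoothing: swap the non-differentiable max(x, 0) for a smooth
# surrogate, then solve a sequence of subproblems with decreasing tau.
import numpy as np
from scipy.optimize import minimize

def smooth_max0(x, tau):
    return 0.5 * (x + np.sqrt(x**2 + tau**2))  # -> max(x, 0) as tau -> 0

def objective(theta, tau):
    # Smoothed version of the original: (max(theta - 1, 0))^2 + (theta + 0.5)^2
    return smooth_max0(theta[0] - 1.0, tau) ** 2 + (theta[0] + 0.5) ** 2

theta = np.array([3.0])
for tau in (1.0, 0.1, 0.01, 1e-4):  # gradually approach the original problem
    theta = minimize(objective, theta, args=(tau,), method="BFGS").x
    print(f"tau={tau:<6} theta={theta[0]:.4f}")  # converges to the true minimizer -0.5
```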
Procedia PDF Downloads 149
6581 Developing Location-allocation Models in the Three Echelon Supply Chain
Authors: Mehdi Seifbarghy, Zahra Mansouri
Abstract:
In this paper, several location-allocation models are developed for a multi-echelon supply chain including suppliers, manufacturers, distributors, and retailers. The objectives are maximizing demand coverage, minimizing the total distance of distributors from suppliers, minimizing facility establishment costs, and minimizing environmental effects. Since the models are multi-objective in nature, we suggest a number of goal-based solution techniques, such as the L-P metric, goal programming, multi-choice goal programming, and goal attainment, to solve the problems.
Keywords: location, multi-echelon supply chain, covering, goal programming
Procedia PDF Downloads 560
6580 A Machine Learning Model for Dynamic Prediction of Chronic Kidney Disease Risk Using Laboratory Data, Non-Laboratory Data, and Metabolic Indices
Authors: Amadou Wurry Jallow, Adama N. S. Bah, Karamo Bah, Shih-Ye Wang, Kuo-Chung Chu, Chien-Yeh Hsu
Abstract:
Chronic kidney disease (CKD) is a major public health challenge with high prevalence, rising incidence, and serious adverse consequences. Developing effective risk prediction models is a cost-effective approach to predicting and preventing its complications. This study aimed to develop an accurate machine learning model that can dynamically identify individuals at risk of CKD using various kinds of diagnostic data, with or without laboratory data, at different follow-up points. Creatinine is a key component used to predict CKD. These models will enable affordable and effective screening for CKD even with incomplete patient data, such as the absence of creatinine testing. This retrospective cohort study included data on 19,429 adults provided by a private research institute and screening laboratory in Taiwan, gathered between 2001 and 2015. Univariate Cox proportional hazards regression analyses were performed to determine the variables with high prognostic value for predicting CKD. We then identified interacting variables and grouped them according to diagnostic data categories. Our models used three types of data gathered at three points in time: non-laboratory data, laboratory data, and metabolic indices. Next, we used subgroups of variables within each category to train two machine learning models (Random Forest and XGBoost). Our machine learning models can dynamically discriminate individuals at risk of developing CKD. All the models performed well using all three kinds of data, with or without laboratory data. Using only non-laboratory data (such as age, sex, body mass index (BMI), and waist circumference), both models predict chronic kidney disease as accurately as models using laboratory and metabolic indices data. Our machine learning models have demonstrated the use of different categories of diagnostic data for CKD prediction, with or without laboratory data. The machine learning models are simple to use and flexible because they work even with incomplete data and can be applied in any clinical setting, including settings where laboratory data are difficult to obtain.
Keywords: chronic kidney disease, glomerular filtration rate, creatinine, novel metabolic indices, machine learning, risk prediction
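A minimal sketch of the non-laboratory model variant: a random forest trained only on demographic and anthropometric features. The synthetic cohort and the outcome rule are illustrative assumptions, not the Taiwan data.

```python
# Random forest CKD screening using only non-laboratory features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 5000
age = rng.uniform(20, 80, n)
sex = rng.integers(0, 2, n)
bmi = rng.normal(25, 4, n)
waist = bmi * 2.8 + rng.normal(0, 5, n)
# Assumed toy risk function used only to generate labels for the sketch:
risk = 1 / (1 + np.exp(-(0.06 * (age - 50) + 0.08 * (bmi - 25) + 0.3 * sex)))
ckd = rng.uniform(size=n) < risk

X = np.column_stack([age, sex, bmi, waist])
X_tr, X_te, y_tr, y_te = train_test_split(X, ckd, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print("AUC without laboratory data:", round(auc, 3))
```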
Procedia PDF Downloads 106
6579 The Involvement of Visual and Verbal Representations Within a Quantitative and Qualitative Visual Change Detection Paradigm
Authors: Laura Jenkins, Tim Eschle, Joanne Ciafone, Colin Hamilton
Abstract:
An original working memory model suggested the separation of visual and verbal systems in working memory architecture, in which only visual working memory components were used during visual working memory tasks. It was later suggested that the visuospatial sketchpad was the only memory component in use during visual working memory tasks, and components such as the phonological loop were not considered. In more recent years, a contrasting approach has been developed, using an executive resource to incorporate both visual and verbal representations in visual working memory paradigms. This was supported by research demonstrating the use of verbal representations and an executive resource in a visual matrix patterns task. The aim of the current research is to investigate the working memory architecture during both a quantitative and a qualitative visual working memory task. A dual-task method will be used, with three secondary tasks designed to tap specific components within the working memory architecture: Dynamic Visual Noise (visual components), Visual Attention (spatial components), and Verbal Attention (verbal components). A comparison of the visual working memory tasks will be made to discover whether verbal representations are in use, as the previous literature suggested. This direct comparison has not been made so far in the literature. Consideration will be given to whether a domain-specific approach should be employed when discussing visual working memory tasks, or whether a more domain-general approach could be used instead.
Keywords: semantic organisation, visual memory, change detection
Procedia PDF Downloads 596
6578 Intensive Use of Software in Teaching and Learning Calculus
Authors: Nodelman V.
Abstract:
Despite serious difficulties in the assimilation of the conceptual system of Calculus, software is used in the educational process only occasionally, and even then mainly for illustration purposes. There are several reasons for this: the non-trivial nature of the studied material; lack of skills in working with software; fear of losing time while working with software; the variety of the software itself, with its corresponding interfaces, syntax, and working methods; the need to find suitable models and become familiar with them; and the incomplete compatibility of the available models with the content and teaching methods of the studied material. This paper proposes an active use of the non-commercial software VusuMatica, which removes these restrictions through broad support for the studied mathematical material (and not only Calculus), so that there is no need to select the right software; an emphasis on the unity of mathematics and its intra-subject and interdisciplinary relations; a user-friendly interface; the absence of special syntax for defining mathematical objects; ease of building and manipulating models of the studied material; and unlimited flexibility of models thanks to the ability to redefine objects, which allows exploring objects' characteristics and considering examples and counterexamples of the concepts under study. The construction of models is based on an original approach to analyzing the structure of the studied concepts. Thanks to the ease of construction, students are able not only to use ready-made models but also to create them on their own and explore the studied material with their help. The presentation includes examples of using VusuMatica in studying the concepts of the limit and continuity of a function, its derivative, and its integral.
Keywords: counterexamples, limitations and requirements, software, teaching and learning calculus, user-friendly interface and syntax
Procedia PDF Downloads 83
6577 Sleep Quality and Burnout, Mental and Physical Health of Polish Healthcare Workers
Authors: Maciej Bialorudzki, Zbigniew Izdebski, Alicja Kozakiewicz, Joanna Mazur
Abstract:
The quality of sleep is extremely important for physical and mental health, especially among professional groups exposed to the suffering of the people they serve. The aim of the study is to assess sleep quality and various aspects of physical and mental health. A nationwide cross-sectional survey conducted in the first quarter of 2022 included 2227 healthcare professionals from 114 Polish hospitals and specialized outpatient clinics. The following distribution across professional groups was obtained: 22% doctors, 52.6% nurses, 7.3% paramedics, 10.1% other medical professionals, and 7.9% other non-medical professionals. The mean age of the respondents was 46.24 (SD=11.53). The four-item Jenkins Sleep Scale (JSS-4) was used to assess sleep quality, yielding a mean value of 5.35 (SD=5.20) in the study group, with 13.7% of subjects classified as having poor sleep quality using a cutoff of a JSS-4 sum score >11. Women reported poor sleep quality more often than men (14.8% vs. 9.1%, p=0.002). Respondents with poor sleep quality were more likely to report occupational burnout as measured by the BAT-12 (43.1% vs. 12.9%, p<0.001) and high levels of stress as measured by the PSS-4 (72.5% vs. 27.5%, p<0.001). In addition, those who declared having experienced a traumatic event had an almost two times higher risk of poorer sleep quality than those who had not (OR: 1.958; 95% CI: 1.509-2.542; p<0.001), while those with occupational burnout had more than five times the risk of those without (OR: 5.092; 95% CI: 3.763-6.889; p<0.001). Sleep quality remains an important predictor of stress levels, job burnout, and quality of life assessment.
Keywords: quality of sleep, medical staff, mental health, physical health, occupational burnout, stress
Procedia PDF Downloads 74
6576 Nanoparticles on Biological Biomarkers Models: Paramecium tetraurelia and Helix aspersa
Authors: H. Djebar, L. Khene, M. Boucenna, M. R. Djebar, M. N. Khebbeb, M. Djekoun
Abstract:
Currently in toxicology, the use of alternative models makes it possible to understand the mechanisms of toxicity at different cellular levels. The objectives of our research concern determining the effect of ZnO, TiO2, AlO2, and FeO2 nanoparticles (NPs) on the freshwater ciliate protist Paramecium sp. and on Helix aspersa. The results obtained show that NPs increased antioxidative enzyme activities, such as catalase and glutathione-S-transferase, as well as GSH levels. Also, cells treated with high concentrations of NPs showed a high level of MDA. In conclusion, observations from growth and enzymatic parameters suggest, on the one hand, that treatment with NPs provokes oxidative stress and, on the other, that the snail and the paramecium are excellent alternative models for ecotoxicological studies.
Keywords: NPs, GST, catalase, GSH, MDA, toxicity, snail and paramecium
Procedia PDF Downloads 283
6575 A Large Language Model-Driven Method for Automated Building Energy Model Generation
Authors: Yake Zhang, Peng Xu
Abstract:
The development of the building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method for generating them using a large language model and a BEM library, aimed at improving the efficiency of model generation. The method leverages a large language model to parse user-specified requirements for target building models, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is used to retrieve energy models that match the target building’s characteristics, serving as reference information for the large language model to enhance the accuracy and relevance of the generated model and allowing the creation of a building energy model that adapts to the user’s modeling requirements. This study enables the automatic creation of building energy models from natural language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of large language models in the field of building simulation and performance modeling.
Keywords: artificial intelligence, building energy modelling, building simulation, large language model
Procedia PDF Downloads 28
6574 A Novel Algorithm for Parsing IFC Models
Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai
Abstract:
Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided building modeling that architects, engineers, and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open IFC (Industry Foundation Classes) files, which aim at interoperability for exchanging information throughout the project lifecycle among various disciplines. The methods developed in previous studies require either an IFC schema or an MVD, together with software applications such as an IFC model server or a Building Information Modeling (BIM) authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial or total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
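For contrast with the schema-free algorithm proposed here, the following sketch shows the conventional library-based route to partial-model extraction using the open-source ifcopenshell package; the file name and entity filter are placeholders, and this is not the authors' algorithm.

```python
# Conventional partial-model extraction: parse an IFC file and pull one entity type.
import ifcopenshell

model = ifcopenshell.open("building.ifc")  # placeholder file name

# Extract a partial model: all wall entities from the instance model.
walls = model.by_type("IfcWall")
print(f"{len(walls)} walls found")
for wall in walls[:3]:
    print(wall.GlobalId, wall.Name)        # standard IfcRoot attributes
```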
Procedia PDF Downloads 300
6573 Forecasting Performance Comparison of Autoregressive Fractional Integrated Moving Average and Jordan Recurrent Neural Network Models on the Turbidity of Stream Flows
Authors: Daniel Fulus Fom, Gau Patrick Damulak
Abstract:
In this study, the Autoregressive Fractional Integrated Moving Average (ARFIMA) and Jordan Recurrent Neural Network (JRNN) models were employed to model the daily turbidity flow of White Clay Creek (WCC). The two methods were applied to the log-difference series of the daily turbidity flow series of WCC. The error measures employed to investigate the forecasting performance of the ARFIMA and JRNN models are the Root Mean Square Error (RMSE) and the Mean Absolute Error (MAE). The outcome of the investigation revealed that the forecasting performance of the JRNN technique is better than that of the ARFIMA technique in the mean square error sense. The results of the ARFIMA and JRNN models were obtained by simulating the models in MATLAB version 8.03. The significance of using the log-difference series rather than the difference series is that the log-difference transformation stabilizes the turbidity flow series better than plain differencing for both the ARFIMA and JRNN models.
Keywords: autoregressive, mean absolute error, neural network, root mean square error
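A minimal sketch of the shared preprocessing and scoring: forecasts are evaluated on the log-difference series with RMSE and MAE. A naive persistence forecast stands in for the fitted ARFIMA and JRNN models, and the series is synthetic.

```python
# Log-difference preprocessing plus RMSE/MAE scoring of a one-step forecast.
import numpy as np

rng = np.random.default_rng(10)
turbidity = np.exp(np.cumsum(rng.normal(0, 0.1, 300)) + 3)  # synthetic daily flow
log_diff = np.diff(np.log(turbidity))                       # stabilizing transform

actual = log_diff[1:]
forecast = log_diff[:-1]   # naive persistence stand-in for ARFIMA/JRNN output

rmse = np.sqrt(np.mean((actual - forecast) ** 2))
mae = np.mean(np.abs(actual - forecast))
print(f"RMSE={rmse:.4f}  MAE={mae:.4f}")  # lower values mean better forecasts
```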
Procedia PDF Downloads 268
6572 Preliminary Conceptions of 3D Prototyping Model to Experimental Investigation in Hypersonic Shock Tunnels
Authors: Thiago Victor Cordeiro Marcos, Joao Felipe de Araujo Martos, Ronaldo de Lima Cardoso, David Romanelli Pinto, Paulo Gilberto de Paula Toro, Israel da Silveira Rego, Antonio Carlos de Oliveira
Abstract:
Currently, the use of 3D rapid prototyping, also known as 3D printing, is being investigated by some universities around the world as an innovative, fast, flexible, and cheap technique for directly manufacturing plastic models that are lighter, can have complex geometries, and can be tested in hypersonic shock tunnels. Initially, the purpose is to integrate prototyped parts with the metal models that are currently manufactured through conventional machining, and thereafter to replace them with completely prototyped models. The mechanical design of models to be tested in a hypersonic shock tunnel is based on conventional manufacturing processes and is therefore limited to standard forms and geometries. The use of 3D rapid prototyping offers a range of options that enables innovation in the geometries and approaches used to design new models. The conception and design of a prototyped model for a hypersonic shock tunnel should be rethought and adapted in comparison with conventional manufacturing processes, in order to fully exploit the creativity and flexibility allowed by the 3D prototyping process. The objective of this paper is to compare the conception and design of a 3D rapid prototyping model with that of a conventional machining model, showing the advantages and disadvantages of each process and the benefits that 3D prototyping can bring to the manufacture of models to be tested in a hypersonic shock tunnel.
Keywords: 3D printing, 3D prototyping, experimental research, hypersonic shock tunnel
Procedia PDF Downloads 470
6571 Neural Machine Translation for Low-Resource African Languages: Benchmarking State-of-the-Art Transformer for Wolof
Authors: Cheikh Bamba Dione, Alla Lo, Elhadji Mamadou Nguer, Siley O. Ba
Abstract:
In this paper, we propose two neural machine translation (NMT) systems (French-to-Wolof and Wolof-to-French) based on sequence-to-sequence with attention and transformer architectures. We trained our models on a parallel French-Wolof corpus of about 83k sentence pairs. Because of the low-resource setting, we experimented with advanced methods for handling data sparsity, including subword segmentation, back translation, and the copied corpus method. We evaluate the models using the BLEU score and find that the transformer outperforms the classic seq2seq model in all settings, in addition to being less sensitive to noise. In general, the best scores are achieved when training the models on word-level units. For subword-level models, using back translation proves to be slightly beneficial in low-resource (WO) to high-resource (FR) language translation for the transformer (but not for the seq2seq) models. A slight improvement can also be observed when injecting copied monolingual text in the target language. Moreover, combining the copied-corpus data with back translation leads to a substantial improvement in translation quality.
Keywords: backtranslation, low-resource language, neural machine translation, sequence-to-sequence, transformer, Wolof
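A minimal sketch of the subword segmentation step using SentencePiece BPE (a common choice; the abstract does not name the exact segmenter, and the file names and vocabulary size are illustrative assumptions).

```python
# Train a BPE subword model on the French side of the parallel corpus,
# then segment a sentence into subword units.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="train.fr",          # placeholder corpus file
    model_prefix="fr_bpe",
    vocab_size=8000,           # illustrative size for a low-resource setup
    model_type="bpe",
)

sp = spm.SentencePieceProcessor(model_file="fr_bpe.model")
pieces = sp.encode("Je vais au marché demain.", out_type=str)
print(pieces)  # subword units reduce data sparsity for low-resource NMT
```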
Procedia PDF Downloads 147
6570 The Influence of Contact Models on Discrete Element Modeling of the Ballast Layer Subjected to Cyclic Loading
Authors: Peyman Aela, Lu Zong, Guoqing Jing
Abstract:
Recently, there has been growing interest in the numerical modeling of ballasted railway tracks. A commonly used mechanistic modeling approach for ballast is the discrete element method (DEM). Up to now, the effects of the contact model on ballast particle behavior have not been precisely examined. In this regard, selecting the appropriate contact model depends mainly on the particle characteristics and the loading condition. Since ballast is a cohesionless material, different contact models, including the linear spring, Hertz-Mindlin, and hysteretic models, can be used to calculate particle-particle or wall-particle contact forces. Moreover, the simulation of a dynamic test is vital for investigating the effect of damping parameters on ballast deformation. In this study, ballast box tests were simulated by DEM to examine the influence of different contact models on the mechanical behavior of the ballast layer under cyclic loading. This paper shows how the contact model can affect the deformation and damping of a ballast layer subjected to cyclic loading in a ballast box.
Keywords: ballast, contact model, cyclic loading, DEM
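A minimal sketch contrasting two of the normal contact laws named above for a sphere-sphere contact; the material values are illustrative ballast-like parameters, not the study's calibrated inputs.

```python
# Linear spring F = k*delta versus Hertz F = (4/3)*E_eff*sqrt(R_eff)*delta**1.5
# for two identical elastic spheres in contact.
import numpy as np

E, nu, R = 50e9, 0.25, 0.02            # Young's modulus (Pa), Poisson ratio, radius (m)
E_eff = E / (2 * (1 - nu**2))          # effective modulus, two identical spheres
R_eff = R / 2                          # effective radius, two identical spheres
k_linear = 1e8                         # assumed linear spring stiffness (N/m)

overlap = np.linspace(0, 1e-4, 5)      # contact overlap delta (m)
f_linear = k_linear * overlap
f_hertz = (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * overlap**1.5

for d, fl, fh in zip(overlap, f_linear, f_hertz):
    print(f"delta={d:.1e} m  linear={fl:,.0f} N  Hertz={fh:,.0f} N")
# The nonlinear Hertz law stiffens with overlap, changing the predicted
# settlement and damping of the layer under cyclic loading.
```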
Procedia PDF Downloads 198