Search results for: atomic models
7169 A Comparative Analysis of the Performance of COSMO and WRF Models in Quantitative Rainfall Prediction
Authors: Isaac Mugume, Charles Basalirwa, Daniel Waiswa, Mary Nsabagwa, Triphonia Jacob Ngailo, Joachim Reuder, Schättler Ulrich, Musa Semujju
Abstract:
Numerical weather prediction (NWP) models are considered powerful tools for guiding quantitative rainfall prediction. A number of NWP models exist and are used at many operational weather prediction centers. This study considers two models, namely the Consortium for Small-scale Modeling (COSMO) model and the Weather Research and Forecasting (WRF) model. It compares the models' ability to predict rainfall over Uganda for the period 21st April 2013 to 10th May 2013 using the root mean square error (RMSE) and the mean error (ME). In comparing the performance of the models, this study assesses their ability to predict light rainfall events and extreme rainfall events. All experiments used the default parameterization configurations and the same horizontal resolution (7 km). The results show that the COSMO model had a tendency of largely predicting no rain, which explains its under-prediction. The COSMO model (RMSE: 14.16; ME: -5.91) presented a significantly (p = 0.014) higher magnitude of error compared to the WRF model (RMSE: 11.86; ME: -1.09). However, the COSMO model (RMSE: 3.85; ME: 1.39) performed significantly (p = 0.003) better than the WRF model (RMSE: 8.14; ME: 5.30) in simulating light rainfall events. Both models under-predicted extreme rainfall events, with the COSMO model (RMSE: 43.63; ME: -39.58) presenting significantly higher error magnitudes than the WRF model (RMSE: 35.14; ME: -26.95). This study recommends additional diagnosis of the models' treatment of deep convection over the tropics.
Keywords: comparative performance, the COSMO model, the WRF model, light rainfall events, extreme rainfall events
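A minimal sketch of how the two error metrics used above can be computed from paired forecast and observation series (the arrays and the paired t-test below are assumptions for illustration, not the paper's actual data or significance test):

```python
import numpy as np
from scipy import stats

def rmse(forecast, observed):
    """Root mean square error between forecast and observed rainfall."""
    return np.sqrt(np.mean((np.asarray(forecast) - np.asarray(observed)) ** 2))

def mean_error(forecast, observed):
    """Mean error (bias); negative values indicate under-prediction."""
    return np.mean(np.asarray(forecast) - np.asarray(observed))

# hypothetical daily rainfall (mm) at one station
observed = np.array([0.0, 12.4, 3.1, 25.0, 0.5])
cosmo    = np.array([0.0,  2.0, 0.0,  8.0, 0.0])
wrf      = np.array([1.0, 10.0, 5.5, 18.0, 2.0])

print("COSMO:", rmse(cosmo, observed), mean_error(cosmo, observed))
print("WRF:  ", rmse(wrf, observed), mean_error(wrf, observed))

# one possible significance test on the absolute errors of the two models
t, p = stats.ttest_rel(np.abs(cosmo - observed), np.abs(wrf - observed))
print("p-value:", p)
```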
Procedia PDF Downloads 260
7168 Theoretical Study of Electronic Structure of Erbium (Er), Fermium (Fm), and Nobelium (No)
Authors: Saleh O. Allehabi, V. A. Dzuba, V. V. Flambaum, Jiguang Li, A. V. Afanasjev, S. E. Agbemava
Abstract:
Recently developed versions of the configuration interaction method for open shells, namely configuration interaction with perturbation theory (CIPT) and configuration interaction with many-body perturbation theory (CI+MBPT), are used to study the electronic structure of the Er, Fm, and No atoms. Excitation energies of odd states connected to the even ground state by electric dipole transitions, the corresponding transition rates, isotope shifts, hyperfine structure, ionization potentials, and static scalar polarizabilities are calculated. A way of extracting parameters of the nuclear charge distribution beyond the nuclear root mean square (RMS) radius, e.g., the quadrupole deformation parameter β, is demonstrated. In nuclei with spin > 1/2, the parameter β is extracted from the quadrupole hyperfine structure. With zero nuclear spin or spin 1/2, this is impossible since the quadrupole moment is zero, so a different method was developed. Measurements of at least two atomic transitions are needed to disentangle the contributions of the changes in deformation and in nuclear RMS radius to the field isotopic shift. This is important for testing nuclear theory and for searching for the hypothetical island of stability. Fm and No are heavy elements approaching the superheavy region, for which the experimental data are very poor: only seven lines are known for Fm and one line for No. Since Er and Fm have similar electronic structures, calculations for Er serve as a guide to the accuracy of the calculations. Twenty-eight new levels of the Fm atom are reported.
Keywords: atomic spectra, electronic transitions, isotope effect, electron correlation calculations for atoms
Procedia PDF Downloads 155
7167 From Problem Space to Executional Architecture: The Development of a Simulator to Examine the Effect of Autonomy on Mainline Rail Capacity
Authors: Emily J. Morey, Kevin Galvin, Thomas Riley, R. Eddie Wilson
Abstract:
The key challenges faced in integrating autonomous rail operations into the existing mainline railway environment have been identified through the understanding and framing of the problem space and stakeholder analysis. This was achieved through the completion of the first four steps of Soft Systems Methodology, where the problem space has been expressed via conceptual models. Having identified these challenges, we investigated one of them, namely capacity, via the use of models and simulation. This paper examines the approach used to move from the conceptual models to a simulation which can determine whether the integration of autonomous trains can plausibly increase capacity. Within this approach, we developed an architecture and converted logical models into physical resource models and associated design features, which were used to build a simulator. From this simulator, we are able to analyse mixtures of legacy and autonomous operations and produce fundamental diagrams and trajectory plots to describe the dynamic behaviour of mixed mainline railway operations.
Keywords: autonomy, executable architecture, modelling and simulation, railway capacity
Procedia PDF Downloads 81
7166 Initial Concept of Islamic Social Entrepreneurship: Identification of Research Gap from Existing Model
Authors: Mohd Adib Abd Muin
Abstract:
Social entrepreneurship has become a new phenomenon for reducing social problems and eradicating poverty in communities. However, the study of social entrepreneurial activity from an Islamic perspective is still new. In addition, this research found a lack of social entrepreneurship models that focus on the Islamic perspective. Therefore, the objective of this paper is to identify the issues and research gaps in existing models from an Islamic perspective and to develop a concept of Islamic social entrepreneurship according to the Islamic perspective and Maqasid Shari'ah. The research methods used in this study are a literature review and a comparative analysis of 11 existing models of social entrepreneurship. The findings show that the 11 existing models analysed do not emphasize the Islamic perspective.
Keywords: component, social entrepreneurship, Islamic perspective, research gap
Procedia PDF Downloads 448
7165 Principal Component Analysis Combined Machine Learning Techniques on Pharmaceutical Samples by Laser Induced Breakdown Spectroscopy
Authors: Kemal Efe Eseller, Göktuğ Yazici
Abstract:
Laser-induced breakdown spectroscopy (LIBS) is a rapid optical atomic emission spectroscopy technique used for material identification and analysis, with the advantages of in-situ analysis, elimination of intensive sample preparation, and micro-destructive properties for the material to be tested. LIBS delivers short laser pulses onto the material in order to create a plasma by exciting the material above a certain threshold. The plasma characteristics, which consist of wavelength values and intensity amplitudes, depend on the material and the experiment's environment. In the present work, medicine samples' spectrum profiles were obtained via LIBS. The medicine datasets include two different concentrations for each of two paracetamol-based medicines, namely Aferin and Parafon. The spectrum data of the samples were preprocessed by filling outliers based on quartiles, smoothing spectra to eliminate noise, and normalizing both the wavelength and intensity axes. Statistical information was obtained, and principal component analysis (PCA) was applied to both the preprocessed and raw datasets. The machine learning models were set up with two different train-test splits, 70% training - 30% test and 80% training - 20% test. Cross-validation was used to protect the models against overfitting, since the sample amount is small. The machine learning results for the preprocessed and raw datasets were compared for both splits. This is the first time that all supervised machine learning classification algorithms, consisting of decision trees, discriminant analysis, naïve Bayes, support vector machines (SVM), k-NN (k-nearest neighbor), ensemble learning and neural network algorithms, were applied to LIBS data of paracetamol-based pharmaceutical samples and their different concentrations, on preprocessed and raw datasets, in order to observe the effect of preprocessing.
Keywords: machine learning, laser-induced breakdown spectroscopy, medicines, principal component analysis, preprocessing
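A minimal sketch of the PCA-plus-classifier workflow described above, using scikit-learn with cross-validation (the synthetic spectra, component count and SVM choice are assumptions for illustration, not the study's actual data or settings):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# hypothetical LIBS spectra: 40 samples x 2048 wavelength channels,
# 4 classes (two medicines x two concentrations)
X = rng.normal(size=(40, 2048))
y = np.repeat([0, 1, 2, 3], 10)

# standardize intensities, reduce dimensionality with PCA, then classify
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))

# 5-fold cross-validation guards against overfitting on a small sample set
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean(), scores.std())
```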
Procedia PDF Downloads 86
7164 Air Quality Analysis Using Machine Learning Models Under Python Environment
Authors: Salahaeddine Sbai
Abstract:
Air quality analysis using machine learning models is a method employed to assess and predict air pollution levels. This approach leverages the capabilities of machine learning algorithms to analyze vast amounts of air quality data and extract valuable insights. By training these models on historical air quality data, they can learn patterns and relationships between various factors such as weather conditions, pollutant emissions, and geographical features. The trained models can then be used to predict air quality levels in real time or forecast future pollution levels. This application of machine learning in air quality analysis enables policymakers, environmental agencies, and the general public to make informed decisions regarding health, environmental impact, and mitigation strategies. By understanding the factors influencing air quality, interventions can be implemented to reduce pollution levels, mitigate health risks, and enhance overall air quality management. Climate change is having significant impacts on Morocco, affecting various aspects of the country's environment, economy, and society. In this study, we use several machine learning models in a Python environment to predict and analyze air quality change over northern Morocco and to evaluate the impact of climate change on agriculture.
Keywords: air quality, machine learning models, pollution, pollutant emissions
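A minimal sketch of the kind of supervised model the abstract describes, predicting a pollutant concentration from meteorological features (the feature names, synthetic data and random-forest choice are assumptions for illustration, not the study's actual setup):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
n = 500
# hypothetical hourly records for a station in northern Morocco
data = pd.DataFrame({
    "temperature": rng.normal(22, 6, n),
    "wind_speed": rng.gamma(2.0, 2.0, n),
    "humidity": rng.uniform(20, 95, n),
})
# synthetic target: PM10 rising with temperature and humidity, diluted by wind
data["pm10"] = (30 + 0.8 * data.temperature - 2.5 * data.wind_speed
                + 0.1 * data.humidity + rng.normal(0, 5, n))

X_train, X_test, y_train, y_test = train_test_split(
    data[["temperature", "wind_speed", "humidity"]], data["pm10"], test_size=0.2)

model = RandomForestRegressor(n_estimators=200).fit(X_train, y_train)
print(mean_absolute_error(y_test, model.predict(X_test)))
```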
Procedia PDF Downloads 91
7163 A Review of Literature on Theories of Construction Accident Causation Models
Authors: Samuel Opeyemi Williams, Razali Bin Adul Hamid, M. S. Misnan, Taki Eddine Seghier, D. I. Ajayi
Abstract:
Construction sites are characterized by occupational risks. A review of the literature on construction accidents reveals that many theories have been propounded over the years by different theorists, along with numerous models developed by different proponents at different times. Accidents are unplanned events that are prominent on construction sites, involving materials, objects and people, with attendant damage, losses and injuries. Models have been developed to investigate the causes of accidents with the aim of preventing their occurrence. Some of these theories have been criticized, most especially the Heinrich domino theory, which is mostly faulted for placing too much blame on operatives rather than on management. The purpose of this paper is to examine the significant construction accident causation theories and models in order to improve understanding of them and consequently enable construction stakeholders to identify potential hazards on construction sites, as all stakeholders have significant roles to play in preventing accidents. Accidents are preventable; hence, understanding accident risk factors and causation theories paves the way for prevention. However, the findings reveal that some gaps remain in the existing models, and further research is recommended to develop additional models that help maintain zero accidents on construction sites.
Keywords: domino theory, construction site, site safety, accident causation model
Procedia PDF Downloads 302
7162 Modelling and Simulation of Diffusion Effect on the Glycol Dehydration Unit of a Natural Gas Plant
Authors: M. Wigwe, J. G Akpa, E. N Wami
Abstract:
Mathematical models of the absorber of a glycol dehydration facility were developed using the principles of conservation of mass and energy. Models which predict the variation of the water content of the gas (in mole fraction) and the variation of the gas and liquid temperatures across the packing height were developed. These models contain contributions from bulk and diffusion flows. The effect of diffusion on the process occurring in the absorber was studied in this work. The models were validated using the initial conditions in the plant data from the Company W TEG unit in Nigeria. The results obtained showed that the effect of diffusion was noticed between z = 0 and z = 0.004 m. A deviation from plant data of 0% was observed for the gas water content at a residence time of 20 seconds, at z = 0.004 m. Similarly, deviations of 1.584% and 2.844% were observed for the gas and TEG temperatures.
Keywords: separations, absorption, simulation, dehydration, water content, triethylene glycol
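A minimal sketch of the kind of one-dimensional gas-phase water balance along the packing height that such absorber models contain (the mass-transfer coefficient, equilibrium mole fraction and inlet value below are assumptions for illustration, not the paper's values or its full model):

```python
import numpy as np
from scipy.integrate import solve_ivp

# hypothetical column and transfer parameters
G = 0.5        # gas molar flux, kmol/(m^2 s)
Kya = 40.0     # overall gas-phase mass-transfer coefficient times area, kmol/(m^3 s)
y_eq = 1e-4    # assumed (constant) equilibrium water mole fraction over lean TEG

def dydz(z, y):
    # steady-state water balance on the gas phase along the packing height z
    return -(Kya / G) * (y - y_eq)

sol = solve_ivp(dydz, (0.0, 0.004), [1.5e-3], dense_output=True)
z = np.linspace(0.0, 0.004, 5)
print(sol.sol(z)[0])   # gas water mole fraction profile near the inlet
```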
Procedia PDF Downloads 497
7161 Comparative Study of Experimental and Theoretical Convective, Evaporative for Two Model Distiller
Authors: Khaoula Hidouri, Ali Benhmidene, Bechir Chouachi
Abstract:
The purification of brackish seawater is becoming a necessity rather than a choice in the face of demographic and industrial growth, especially in third-world countries. Two models are used in this work: a simple solar still and a simple solar still coupled with a heat pump. In this research, the water productivity of the Simple Solar Distiller (SSD) and the Simple Solar Distiller with a Hybrid Heat Pump (SSDHP) was determined as a function of the orientation, the use of a heat pump, and the use of a single or double glass cover. The productivity can exceed 1.2 L/m²h for the SSDHP model and 0.5 L/m²h for the SSD model. The global efficiency determined for the SSD and SSDHP models is 30% and 50%, respectively. The internal efficiency reached 35% for the SSD model and 60% for the SSDHP model. The convective heat transfer coefficient reached 2.5 W/m²°C and 0.5 W/m²°C for the SSDHP and SSD models, respectively.
Keywords: productivity, efficiency, convective heat coefficient, SSD model, SSDHP model
Procedia PDF Downloads 211
7160 Integrated Models of Reading Comprehension: Understanding to Impact Teaching—The Teacher’s Central Role
Authors: Sally A. Brown
Abstract:
Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that features of cognitive processing and social cognitivism theory—represented in the integrated framework—highlight the importance of the role of the teacher. This model can aid teachers in not only improving reading comprehension instruction but in identifying areas of challenge for students.
Keywords: explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role
Procedia PDF Downloads 96
7159 Hydrological Modeling of Watersheds Using the Only Corresponding Competitor Method: The Case of M’Zab Basin, South East Algeria
Authors: Oulad Naoui Noureddine, Cherif ELAmine, Djehiche Abdelkader
Abstract:
Water resources management includes several disciplines; the modeling of the rainfall-runoff relationship is the most important of them for preventing natural risks. There are several models for studying the rainfall-runoff relationship in watersheds. However, the majority of these models are not applicable in all basins of the world. In this study, a new stochastic method called the Only Corresponding Competitor (OCC) method was used for the hydrological modeling of the M’Zab watershed (South East Algeria) to adapt a few empirical models to any hydrological regime. The results obtained open up a number of perspectives in which it would be interesting to experiment with hydrological models that, collectively or separately, improve the representation of a catchment's data through the OCC method.
Keywords: modelling, optimization, rainfall-runoff relationship, empirical model, OCC
Procedia PDF Downloads 262
7158 Lumped Parameter Models for Numerical Simulation of The Dynamic Response of Hoisting Appliances
Authors: Candida Petrogalli, Giovanni Incerti, Luigi Solazzi
Abstract:
This paper describes three lumped-parameter models for the study of the dynamic behaviour of a boom crane. The models proposed here allow the evaluation of the load fluctuations arising from the rope and structure elasticity and from the type of motion command imposed by the winch. Calculation software was developed in order to determine the actual acceleration of the lifted mass and the dynamic overload during the lifting phase. Some application examples are presented, with the aim of showing the correlation between the magnitude of the stress and the type of motion command employed.
Keywords: crane, dynamic model, overloading condition, vibration
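A minimal single-degree-of-freedom sketch of the kind of lumped-parameter model described above: a load mass hanging on an elastic rope whose upper end follows a winch-imposed motion law (the mass, stiffness, damping and ramped lifting command are assumptions for illustration, not values or models from the paper):

```python
import numpy as np
from scipy.integrate import solve_ivp

m, k, c, g = 2000.0, 4.0e6, 8.0e3, 9.81   # load mass, rope stiffness, damping, gravity

def winch(t, v_max=0.5, t_ramp=0.5):
    """Rope-end displacement and velocity for a ramped (trapezoidal) lifting command."""
    if t < t_ramp:
        return 0.5 * v_max * t**2 / t_ramp, v_max * t / t_ramp
    return 0.5 * v_max * t_ramp + v_max * (t - t_ramp), v_max

def rhs(t, state):
    x, xdot = state              # load displacement measured from static equilibrium
    u, udot = winch(t)
    xddot = (k * (u - x) + c * (udot - xdot)) / m
    return [xdot, xddot]

sol = solve_ivp(rhs, (0.0, 4.0), [0.0, 0.0], max_step=1e-3)
u = np.array([winch(t)[0] for t in sol.t])
udot = np.array([winch(t)[1] for t in sol.t])
rope_force = m * g + k * (u - sol.y[0]) + c * (udot - sol.y[1])
print("dynamic overload factor:", rope_force.max() / (m * g))
```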
Procedia PDF Downloads 573
7157 Effect of the Deposition Time of Hydrogenated Nanocrystalline Si Grown on Porous Alumina Film on Glass Substrate by Plasma Processing Chemical Vapor Deposition
Authors: F. Laatar, S. Ktifa, H. Ezzaouia
Abstract:
The plasma-enhanced chemical vapor deposition (PECVD) method is used to deposit hydrogenated nanocrystalline silicon films (nc-Si:H) on porous anodic alumina films (PAF) on glass substrates for different deposition durations. The influence of the deposition time (DT) on the physical properties of nc-Si:H grown on PAF was investigated through an extensive correlation between the microstructural and optical properties of these films. In this paper, we present an extensive study of the morphological, structural and optical properties of these films by atomic force microscopy (AFM), X-ray diffraction (XRD) and UV-Vis-NIR spectrometry. It was found that changes in DT can modify the film thickness and the surface roughness and eventually improve the optical properties of the composite. The optical properties (optical thicknesses, refractive indices (n), absorption coefficients (α), extinction coefficients (k), and the values of the optical transitions EG) of these samples were obtained from the transmittance (T) and reflectance (R) spectra recorded by the UV-Vis-NIR spectrometer. We used the Cauchy and Wemple-DiDomenico models for the analysis of the dispersion of the refractive index and the determination of the optical properties of these films.
Keywords: hydrogenated nanocrystalline silicon, plasma processing chemical vapor deposition, X-ray diffraction, optical properties
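A minimal sketch of fitting the Cauchy dispersion relation n(λ) = A + B/λ² + C/λ⁴ to refractive-index data extracted from transmittance and reflectance spectra (the data points and initial guesses below are assumptions for illustration, not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def cauchy(lam_um, A, B, C):
    """Cauchy dispersion relation with wavelength in micrometres."""
    return A + B / lam_um**2 + C / lam_um**4

# hypothetical (wavelength, refractive index) pairs derived from T/R spectra
lam = np.array([0.5, 0.6, 0.8, 1.0, 1.2, 1.5])          # micrometres
n = np.array([3.05, 2.92, 2.80, 2.74, 2.71, 2.68])

popt, pcov = curve_fit(cauchy, lam, n, p0=(2.6, 0.1, 0.0))
A, B, C = popt
print(A, B, C)
print(cauchy(0.7, *popt))    # interpolated refractive index at 0.7 um
```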
Procedia PDF Downloads 375
7156 Advances in Artificial Intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. More precisely, it can be affirmed that speech recognition assists its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research presents an illustration of the recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the foremost challenge affecting speech recognition. In order to overcome these issues, different statistical models were developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence approaches are the most efficient and reliable methods used in speech recognition.
Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance
Procedia PDF Downloads 475
7155 Optimization and Simulation Models Applied in Engineering Planning and Management
Authors: Abiodun Ladanu Ajala, Wuyi Oke
Abstract:
Mathematical simulation and optimization models packaged within interactive computer programs provide a common way for planners and managers to predict the behaviour of any proposed water resources system design or management policy before it is implemented. Modeling is a principal technique for predicting the behaviour of proposed infrastructural designs or management policies. Models can be developed and used to help identify the specific alternative plans that best meet the planning objectives. This study discusses various types of models, their development, architecture, data requirements, and applications in the field of engineering. It also outlines the advantages and limitations of each of the optimization and simulation models presented. The techniques explored in this review include dynamic programming, linear programming, fuzzy optimization, evolutionary algorithms and, finally, artificial intelligence techniques. Previous studies carried out using some of the techniques mentioned above were reviewed, and most of the results showed that optimization and simulation indeed provide viable alternatives and predictions which form a basis for decision making in building engineering structures and in engineering planning and management.
Keywords: linear programming, mutation, optimization, simulation
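A minimal linear programming sketch of the kind of allocation problem such models address, here allocating water between two users to maximize benefit subject to a supply limit (all coefficients are assumptions for illustration, not from the study):

```python
from scipy.optimize import linprog

# maximize 5*x1 + 4*x2 (total benefit)  ->  minimize -5*x1 - 4*x2
c = [-5.0, -4.0]
# x1 + x2 <= 100 (available supply), x1 <= 70, x2 <= 60 (demand caps)
A_ub = [[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]]
b_ub = [100.0, 70.0, 60.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, -res.fun)   # optimal allocation and total benefit
```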
Procedia PDF Downloads 588
7154 Return to Work after a Mental Health Problem: Analysis of Two Different Management Models
Authors: Lucie Cote, Sonia McFadden
Abstract:
Mental health problems in the workplace are currently one of the main causes of absence. Research has highlighted the importance of a collaborative process involving the stakeholders in the return-to-work process and has established the best management practices to ensure a successful return to work. However, very few studies have specifically explored the combination of various management models and determined whether they could satisfy the needs of the stakeholders. The objective of this study is to analyze two models for managing the return to work, the ‘medical-administrative’ model and the ‘support of the worker’ model, in order to understand the actions and actors involved in each. The study also aims to explore whether these models meet the needs of the actors involved in the management of the return to work. A qualitative case study was conducted in a Canadian federal organization. Abundant internal documentation and semi-directed interviews with six managers, six workers and four human resources professionals involved in managing the files of employees returning to work after a mental health problem provided a complete picture of the return-to-work management practices used in this organization. The triangulation of these data facilitated the examination of the benefits and limitations of each approach. The results suggest that the management actions for employee return to work from the two models, ‘support of the worker’ and ‘medical-administrative’, are compatible and can meet the needs of the actors involved in the return to work. More research is needed to develop a structured model integrating the best practices of the two approaches to ensure the success of the return to work.
Keywords: return to work, mental health, management models, organizations
Procedia PDF Downloads 211
7153 Effect of Traffic Volume and Its Composition on Vehicular Speed under Mixed Traffic Conditions: A Kriging Based Approach
Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh
Abstract:
The use of speed prediction models sometimes appears to be a feasible alternative to laborious field measurement, particularly when field data cannot fulfill the designer's requirements. However, developing speed models is a challenging task, specifically in the context of developing countries like India, where vehicles with diverse static and dynamic characteristics use the same right of way without any segregation. Here the traffic composition plays a significant role in determining vehicular speed. The present research was carried out to examine the effects of traffic volume and its composition on vehicular speed under mixed traffic conditions. Classified traffic volume and speed data were collected from different geometrically identical six-lane divided arterials in New Delhi. Based on these field data, speed prediction models were developed for each vehicle category by adopting the Kriging approximation technique, an alternative to commonly used regression. These models were validated with the data set kept aside earlier for validation purposes. The predicted speeds showed a great deal of agreement with the observed values, and the models outperform all other existing speed models. Finally, the proposed models were utilized to evaluate the effect of traffic volume and its composition on speed.
Keywords: speed, Kriging, arterial, traffic volume
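A minimal sketch of the Kriging-type approximation described above, implemented as Gaussian process regression on traffic volume and composition predictors (the synthetic data, kernel and length scales are assumptions for illustration, not the study's fitted model):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)
n = 120
# hypothetical observations: total volume (veh/h) and share of heavy vehicles
volume = rng.uniform(1000, 6000, n)
heavy_share = rng.uniform(0.0, 0.4, n)
X = np.column_stack([volume, heavy_share])
# synthetic car speed (km/h) falling with volume and heavy-vehicle share
speed = 65 - 0.005 * volume - 30 * heavy_share + rng.normal(0, 2, n)

kernel = ConstantKernel() * RBF(length_scale=[1000.0, 0.1])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True, alpha=1e-2).fit(X, speed)

mean, std = gp.predict(np.array([[3500, 0.15]]), return_std=True)
print(mean[0], std[0])   # predicted speed and its uncertainty
```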
Procedia PDF Downloads 350
7152 PM10 Chemical Characteristics in a Background Site at the Universidad Libre Bogotá
Authors: Laura X. Martinez, Andrés F. Rodríguez, Ruth A. Catacoli
Abstract:
One of the most important air pollution concerns is that PM10 concentrations maintain a constant trend, with the exception of some places where they frequently surpass the limits established by Colombian legislation. The community that surrounds the Universidad Libre Bogotá comprises a considerable number of students and workers, all of whom are possibly being exposed to PM10 for long periods of time while on campus. Thus, the chemical characterization of PM10 found in the ambient air at the Universidad Libre Bogotá was identified as the research problem. A Hi-Vol sampler and EPA Test Method 5 were used to determine whether the quality of the air is adequate for the human respiratory system. Additionally, quartz fiber filters were utilized during sampling. Samples were taken three days a week during a dry period throughout the months of November and December 2015. The gravimetric analysis method was used to determine PM10 concentrations. The chemical characterization includes non-conventional carcinogenic pollutants. Atomic absorption spectrophotometry (AAS) was used for the determination of metals, and VOCs were analyzed using the FTIR (Fourier transform infrared spectroscopy) method. In this way, concentrations of PM10 ranging from 13 µg/m³ to 66 µg/m³ were obtained; these values were below the standard limits. It is concluded that the PM10 concentrations over an exposure period of 24 hours are lower than the values established by Colombian law, Resolution 610 of 2010; however, when comparing them with the limits set by the World Health Organization (WHO), these concentrations could possibly exceed permissible levels.
Keywords: air quality, atomic absorption spectrophotometry, gas chromatography, particulate matter
Procedia PDF Downloads 254
7151 Artificial Intelligence for Generative Modelling
Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta
Abstract:
As technology advances towards high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the usage of ‘Generative Design using Artificial Intelligence’ to build better models that adapt operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, but the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and comes up with multiple solutions through iterations, arriving at a sturdy design with the most optimal parameters and saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique used to choose the right results through biomimicry, drawing on natural processes that have evolved over millions of years. The computer uses parametric models to generate newer models using an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares generative design with the topology optimization technology that has previously been used to generate CAD models. Finally, this paper shows the performance of the algorithms and how these algorithms help in designing resource-efficient models.
Keywords: genetic algorithm, biomimicry, generative modeling, non-dominant techniques
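A minimal genetic-algorithm sketch illustrating the selection, crossover and mutation operations mentioned above on a toy two-parameter design problem (the objective function and GA settings below are assumptions for illustration, not the paper's formulation):

```python
import numpy as np

rng = np.random.default_rng(3)

def fitness(x):
    # toy objective: keep two design parameters close to a target shape
    return -((x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2)

pop = rng.uniform(-5, 5, size=(30, 2))          # initial population of designs
for generation in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-15:]]     # selection: keep the best half
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(15, size=2)]
        child = np.where(rng.random(2) < 0.5, a, b)                  # uniform crossover
        child = child + rng.normal(0, 0.1, size=2) * (rng.random() < 0.3)  # mutation
        children.append(child)
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)   # should approach the target design (2, -1)
```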
Procedia PDF Downloads 147
7150 Mixed Effects Models for Short-Term Load Forecasting for the Spanish Regions: Castilla-Leon, Castilla-La Mancha and Andalucia
Authors: C. Senabre, S. Valero, M. Lopez, E. Velasco, M. Sanchez
Abstract:
This paper focuses on an application of linear mixed models to short-term load forecasting. The challenge of this research is to improve a currently working model at the Spanish Transport System Operator, programmed by us and based on linear autoregressive techniques and neural networks. The forecasting system currently forecasts each of the regions within the Spanish grid separately, even though the behavior of the load in each region is affected by the same factors in a similar way. A load forecasting system has been verified in this work using real data from a utility. In this research, several regions are integrated into a linear mixed model as a starting point to exploit the information from other regions. Firstly, the system learns the general behaviors present in all regions, and secondly, the individual deviations in each region are identified. The technique can be especially useful when modeling the effect of special days with scarce information from the past. The three most relevant regions of the system have been used to test the model, focusing on special days and improving on the performance of the currently working models used as benchmarks. A range of comparisons with different forecasting models has been conducted. The forecasting results demonstrate the superiority of the proposed methodology.
Keywords: short-term load forecasting, mixed effects models, neural networks
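A minimal sketch of a linear mixed model of the kind described above, with a fixed temperature effect shared across regions and a random intercept per region (the synthetic data and the statsmodels formula are assumptions for illustration, not the paper's model specification):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
regions = ["Castilla-Leon", "Castilla-La Mancha", "Andalucia"]
rows = []
for region, offset in zip(regions, [0.0, 1.5, -2.0]):   # region-specific deviations
    temp = rng.normal(15, 8, 200)
    load = 100 + offset - 1.2 * temp + rng.normal(0, 3, 200)  # common temperature effect
    rows.append(pd.DataFrame({"region": region, "temperature": temp, "load": load}))
data = pd.concat(rows, ignore_index=True)

# fixed effect: temperature (shared); random effect: intercept per region
model = smf.mixedlm("load ~ temperature", data, groups=data["region"])
result = model.fit()
print(result.summary())
```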
Procedia PDF Downloads 188
7149 Predominance of Teaching Models Used by Math Teachers in Secondary Education
Authors: Verónica Diaz Quezada
Abstract:
This research examines the teaching models used by secondary math teachers when teaching logarithmic, quadratic and exponential functions. For this, descriptive case studies have been carried out on five secondary teachers. These teachers were chosen from three scientific-humanistic and technical schools in Chile. Data were obtained through non-participant class observation and the application of a questionnaire and a rubric to the teachers. According to the results, the didactic model that prevails is the one that starts with an interactive strategy, moves to a more content-based structure, and ends with a reinforcement stage. Nonetheless, there is always influence from the teachers, their methods, and the group of students.
Keywords: teaching models, math teachers, functions, secondary education
Procedia PDF Downloads 187
7148 A Super-Efficiency Model for Evaluating Efficiency in the Presence of Time Lag Effect
Authors: Yanshuang Zhang, Byungho Jeong
Abstract:
In many cases, there is a time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered in evaluating the performance of organizations. Recently, a couple of DEA models were developed to consider the time lag effect in the efficiency evaluation of research activities. The multi-period input (MpI) and multi-period output (MpO) models are integrated models that calculate simple efficiency while considering the time lag effect. However, these models cannot discriminate efficient DMUs because of the nature of the basic DEA model, in which efficiency scores are limited to 1; that is, efficient DMUs cannot be discriminated because their efficiency scores are the same. Thus, this paper suggests a super-efficiency model for efficiency evaluation under the consideration of the time lag effect, based on the MpO model. A case example using a long-term research project is given to compare the suggested model with the MpO model.
Keywords: DEA, super-efficiency, time lag, multi-periods input
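A minimal sketch of the super-efficiency idea, shown here for the standard input-oriented CCR model on single-period data rather than the paper's MpO-based formulation (the data and dimensions are assumptions for illustration): excluding the evaluated DMU from its own reference set lets efficient units score above 1, so they can be ranked.

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, k):
    """Input-oriented CCR super-efficiency score of DMU k.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    others = [j for j in range(n) if j != k]
    # decision variables: [theta, lambda_j for j != k]
    c = np.zeros(1 + len(others))
    c[0] = 1.0                                   # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                           # sum_j lambda_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate(([-X[k, i]], X[others, i])))
        b_ub.append(0.0)
    for r in range(s):                           # sum_j lambda_j * y_rj >= y_rk
        A_ub.append(np.concatenate(([0.0], -Y[others, r])))
        b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (1 + len(others)), method="highs")
    return res.x[0] if res.success else np.nan

# hypothetical single-period data: 5 DMUs, 2 inputs, 1 output
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [6.0, 1.0], [5.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])
for k in range(len(X)):
    print(k, round(super_efficiency(X, Y, k), 3))
```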
Procedia PDF Downloads 470
7147 Exploring Tweet Geolocation: Leveraging Large Language Models for Post-Hoc Explanations
Authors: Sarra Hasni, Sami Faiz
Abstract:
In recent years, location prediction on social networks has gained significant attention, with short and unstructured texts like tweets posing additional challenges. Advanced geolocation models have been proposed, increasing the need to explain their predictions. In this paper, we provide explanations for a geolocation black-box model using LIME and SHAP, two state-of-the-art XAI (eXplainable Artificial Intelligence) methods. We extend our evaluations to Large Language Models (LLMs) as post hoc explainers for tweet geolocation. Our preliminary results show that LLMs outperform LIME and SHAP by generating more accurate explanations. Additionally, we demonstrate that prompts with examples and meta-prompts containing phonetic spelling rules improve the interpretability of these models, even with informal input data. This approach highlights the potential of advanced prompt engineering techniques to enhance the effectiveness of black-box models in geolocation tasks on social networks.
Keywords: large language model, post hoc explainer, prompt engineering, local explanation, tweet geolocation
Procedia PDF Downloads 24
7146 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators
Authors: Andrea Bellucci, Martina Tofi
Abstract:
The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches have 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance, using its strength in the distribution channel while the market share of independent agents is decreasing. Starting with the main business model of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by Associazione Nazionale delle Imprese di Assicurazione (ANIA). The approach applied is a bottom-up analysis, starting with variables and indicators to define the business model classification. The statistical classification algorithm proposed by Ward is employed to design the business model profiles. The results of the analysis are a representation of the main business models, built from their profiles with respect to the indicators. In this way, an unsupervised analysis is developed; it has the limitation of a judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
Keywords: bancassurance, business model, non-life bancassurance, insurance business value drivers
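A minimal sketch of Ward's hierarchical clustering applied to standardized balance sheet indicators (the indicator names, company count and the choice of three clusters are assumptions for illustration, not the study's actual variables):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# hypothetical indicators for 20 life companies:
# premium growth, expense ratio, share of unit-linked reserves
indicators = rng.normal(size=(20, 3))

Z = linkage(StandardScaler().fit_transform(indicators), method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")   # cut the tree into 3 business-model profiles
print(labels)
```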
Procedia PDF Downloads 296
7145 Reducing Uncertainty in Climate Projections over Uganda by Numerical Models Using Bias Correction
Authors: Isaac Mugume
Abstract:
Since the beginning of the 21st century, climate change has been an issue due to the reported rise in global temperature and changes in the frequency as well as severity of extreme weather and climatic events. The changing climate has been attributed to rising concentrations of greenhouse gases and to environmental changes such as those in ecosystems and land use. Climatic projections have been carried out under the auspices of the Intergovernmental Panel on Climate Change, where a number of models have been run to inform us about the likelihood of future climates. Since one of the major forcings driving the changing climate is the emission of greenhouse gases, different scenarios have been proposed and future climates for different periods presented. The global climate models project different areas to experience different impacts. While regional modeling is being carried out for high-impact studies, bias correction is less documented; yet the regional climate models suffer from biases, which introduce uncertainty. This is addressed in this study by bias-correcting the regional models. This study uses the Weather Research and Forecasting model under different representative concentration pathways and corrects the products of these runs using observed climatic data. This study notes that bias correction (e.g., the running-mean bias correction, the best easy systematic estimator method, the simple linear regression method, nearest neighborhood, and weighted mean) improves the climatic projection skill and therefore reduces the uncertainty inherent in the climatic projections.
Keywords: bias correction, climatic projections, numerical models, representative concentration pathways
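A minimal sketch of one of the listed approaches, a simple linear-regression bias correction trained against observations over a historical period and applied to the projection period (the synthetic rainfall series below are assumptions for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(6)
obs_hist = rng.gamma(2.0, 3.0, 365)                        # observed rainfall, calibration period
model_hist = 0.6 * obs_hist + 2.0 + rng.normal(0, 1, 365)  # biased model output, same period
model_future = 0.6 * rng.gamma(2.2, 3.0, 365) + 2.0        # raw projection to be corrected

# fit obs = a * model + b on the overlap period, then apply to the projection
a, b = np.polyfit(model_hist, obs_hist, 1)
corrected_future = a * model_future + b

print("raw mean:", model_future.mean(), "corrected mean:", corrected_future.mean())
```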
Procedia PDF Downloads 117
7144 A Nonlinear Dynamical System with Application
Authors: Abdullah Eqal Al Mazrooei
Abstract:
In this paper, a nonlinear dynamical system is presented. The system belongs to the bilinear class. Bilinear systems are a very important kind of nonlinear system because they have many applications in real life. They are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate. They have also recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages: first, they define many problems of great applied importance; second, they give us approximations to nonlinear systems; and third, they have rich geometric and algebraic structures, which promise to be a fruitful field of research and applications. The type of nonlinearity that is treated and analyzed consists of a bilinear interaction between the state vector and the system input. By using some properties of the tensor product, these systems can be transformed into linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself. This model is therefore able to handle evolutions according to the Lotka-Volterra models or the Lorenz weather models, enabling a wider and more flexible application of such models. As an application, an estimator is used to estimate temperatures. The results prove the efficiency of the proposed system.
Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model
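A minimal sketch of the state-times-state nonlinearity discussed above, simulated here as a Lotka-Volterra system (the coefficients and initial conditions are assumptions for illustration, not the paper's model or estimator):

```python
import numpy as np
from scipy.integrate import solve_ivp

alpha, beta, delta, gamma = 1.1, 0.4, 0.1, 0.4

def lotka_volterra(t, z):
    x, y = z
    # the nonlinearity is the product of state components (x * y)
    return [alpha * x - beta * x * y,
            delta * x * y - gamma * y]

sol = solve_ivp(lotka_volterra, (0.0, 30.0), [10.0, 5.0], max_step=0.01)
print(sol.y[:, -1])   # prey and predator states at the final time
```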
Procedia PDF Downloads 252
7143 Models of State Organization and Influence over Collective Identity and Nationalism in Spain
Authors: Muñoz-Sanchez, Victor Manuel, Perez-Flores, Antonio Manuel
Abstract:
The main objective of this paper is to establish the relationship between models of state organization and the various types of collective identity expressed by the Spanish. The question of nationalism and identity ascription in Spain has always been a topic of special importance, due to the presence in that country of territories where the population expresses very different opinions on nationalist sentiment than the rest of Spain. The current sovereignty challenge posed by Catalonia to the central government exemplifies the importance of the subject matter. In order to analyze this process of interrelation, we mine secondary data by applying the multiple correspondence analysis (MCA) technique. As a main result, a typology of four types of expression of collective identity based on models of state organization is presented, connected with the party positions on this issue.
Keywords: models of organization of the state, nationalism, collective identity, Spain, political parties
Procedia PDF Downloads 441
7142 Participation in IAEA Proficiency Test to Analyse Cobalt, Strontium and Caesium in Seawater Using Direct Counting and Radiochemical Techniques
Authors: S. Visetpotjanakit, C. Khrautongkieo
Abstract:
Radiation monitoring in the environment and foodstuffs is one of the main responsibilities of the Office of Atoms for Peace (OAP), the nuclear regulatory body of Thailand. The main goal of the OAP is to assure the safety of the Thai people and environment from any radiological incidents. Various radioanalytical methods have been developed to monitor radiation and radionuclides in environmental and foodstuff samples. To validate our analytical performance, several proficiency test exercises from the International Atomic Energy Agency (IAEA) have been performed. Here, the results of a proficiency test exercise referred to as the Proficiency Test for Tritium, Cobalt, Strontium and Caesium Isotopes in Seawater 2017 (IAEA-RML-2017-01) are presented. All radionuclides except ³H were analysed using various radioanalytical methods, i.e., direct gamma-ray counting for determining ⁶⁰Co, ¹³⁴Cs and ¹³⁷Cs, and developed radiochemical techniques for analysing ¹³⁴Cs and ¹³⁷Cs using an AMP pre-concentration technique and ⁹⁰Sr using a di-(2-ethylhexyl) phosphoric acid (HDEHP) liquid extraction technique. The analysis results were submitted to the IAEA. All results passed the IAEA criteria, i.e., accuracy, precision and trueness, and obtained ‘Accepted’ status. This confirms the quality of the data from the OAP environmental radiation laboratory for monitoring radiation in the environment.
Keywords: international atomic energy agency, proficiency test, radiation monitoring, seawater
Procedia PDF Downloads 169
7141 Mosaic Augmentation: Insights and Limitations
Authors: Olivia A. Kjorlien, Maryam Asghari, Farshid Alizadeh-Shabdiz
Abstract:
The goal of this paper is to investigate the impact of mosaic augmentation on the performance of object detection solutions. To carry out the study, the YOLOv4 and YOLOv4-Tiny models have been selected; these are popular, advanced object detection models that are also representative of two classes of models, complex and simple. The study has also been carried out on two categories of objects, simple and complex. For this study, YOLOv4 and YOLOv4-Tiny are trained with and without mosaic augmentation for the two sets of objects. While mosaic augmentation improves the performance of simple object detection, it degrades the performance of complex object detection, having the largest negative impact on the false positive rate in the complex object detection case.
Keywords: accuracy, false positives, mosaic augmentation, object detection, YOLOV4, YOLOV4-Tiny
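A minimal sketch of the mosaic operation itself: four training images are tiled into one canvas around a random centre point (the canvas size, the corner-cropping simplification and the omission of bounding-box bookkeeping are assumptions for illustration, not the exact YOLOv4 implementation):

```python
import numpy as np

def mosaic(images, out_size=416, rng=None):
    """Combine four out_size x out_size x 3 images into one mosaic canvas."""
    rng = np.random.default_rng() if rng is None else rng
    canvas = np.zeros((out_size, out_size, 3), dtype=images[0].dtype)
    # random mosaic centre, kept away from the borders
    cx = int(rng.uniform(0.25, 0.75) * out_size)
    cy = int(rng.uniform(0.25, 0.75) * out_size)
    regions = [(slice(0, cy), slice(0, cx)),            # top-left quadrant
               (slice(0, cy), slice(cx, out_size)),     # top-right quadrant
               (slice(cy, out_size), slice(0, cx)),     # bottom-left quadrant
               (slice(cy, out_size), slice(cx, out_size))]  # bottom-right quadrant
    for img, (rows, cols) in zip(images, regions):
        h = rows.stop - rows.start
        w = cols.stop - cols.start
        canvas[rows, cols] = img[:h, :w]   # crop each image to fit its quadrant
        # bounding boxes of img would be shifted and clipped to this quadrant here
    return canvas

rng = np.random.default_rng(7)
imgs = [rng.integers(0, 255, (416, 416, 3), dtype=np.uint8) for _ in range(4)]
print(mosaic(imgs, rng=rng).shape)
```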
Procedia PDF Downloads 125
7140 Investigation of Different Control Strategies for UPFC Decoupled Model and the Impact of Location on Control Parameters
Authors: S. A. Al-Qallaf, S. A. Al-Mawsawi, A. Haider
Abstract:
In order to evaluate the performance of a unified power flow controller (UPFC), mathematical models for steady-state and dynamic analysis have to be developed. The steady-state model is mainly concerned with the incorporation of the UPFC in load flow studies. Several load flow models for the UPFC have been introduced in the literature, and one of the most reliable is the decoupled UPFC model. Despite its simplicity, the decoupled UPFC load flow model is more robust than other UPFC load flow models and has unique capabilities. It has some shortcomings, such as an additional set of nonlinear equations that must be solved separately after the load flow solution is obtained. The aim of this study is to investigate the different control strategies that can be realized in the decoupled load flow model (individual control and combined control) and the impact of the location of the UPFC in the network on its control parameters.
Keywords: UPFC, decoupled model, load flow, control parameters
Procedia PDF Downloads 551