Search results for: fuzzy set Models
6665 Exploring Tweet Geolocation: Leveraging Large Language Models for Post-Hoc Explanations
Authors: Sarra Hasni, Sami Faiz
Abstract:
In recent years, location prediction on social networks has gained significant attention, with short and unstructured texts like tweets posing additional challenges. Advanced geolocation models have been proposed, increasing the need to explain their predictions. In this paper, we provide explanations for a geolocation black-box model using LIME and SHAP, two state-of-the-art XAI (eXplainable Artificial Intelligence) methods. We extend our evaluations to Large Language Models (LLMs) as post-hoc explainers for tweet geolocation. Our preliminary results show that LLMs outperform LIME and SHAP by generating more accurate explanations. Additionally, we demonstrate that prompts with examples and meta-prompts containing phonetic spelling rules improve the interpretability of these models, even with informal input data. This approach highlights the potential of advanced prompt engineering techniques to enhance the effectiveness of black-box models in geolocation tasks on social networks.
Keywords: large language model, post hoc explainer, prompt engineering, local explanation, tweet geolocation
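To make the LIME step concrete, here is a minimal Python sketch of explaining a black-box text classifier with the `lime` package; the city labels, the `predict_proba` stand-in, and the sample tweet are invented assumptions, not the paper's actual model or data.

```python
# Hypothetical sketch: explaining a black-box tweet-geolocation classifier with LIME.
import numpy as np
from lime.lime_text import LimeTextExplainer

CITIES = ["London", "Lagos", "Mumbai"]          # assumed label set

def predict_proba(texts):
    """Placeholder for the black-box geolocation model: returns one
    probability row per tweet over the candidate cities."""
    rng = np.random.default_rng(0)
    p = rng.random((len(texts), len(CITIES)))
    return p / p.sum(axis=1, keepdims=True)

explainer = LimeTextExplainer(class_names=CITIES)
exp = explainer.explain_instance(
    "proper chuffed wiv the footy down the local innit",
    predict_proba,
    num_features=6,            # top tokens driving the prediction
)
print(exp.as_list())           # (token, weight) pairs: the local explanation
```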
Procedia PDF Downloads 25
6664 Material Handling Equipment Selection Using Fuzzy AHP Approach
Authors: Priyanka Verma, Vijaya Dixit, Rishabh Bajpai
Abstract:
This research paper aims to select the most appropriate material handling equipment among the given choices so that the level of automation in material handling can be enhanced. The work is a practical case study of material handling systems in a consumer electronic appliances manufacturing organization. The equipment choices among which the decision has to be made are Automated Guided Vehicles (AGV), Autonomous Mobile Robots (AMR), Overhead Conveyors (OC), and Battery-Operated Trucks/Vehicles (BOT). A certain level of automation needs to be attained in order to reduce human intervention in the organization, and this requirement can be met by the equipment mentioned above. The motive for selecting this equipment for study was based solely on the corporate financial strategy of investment and the return obtained on that investment within a stipulated time frame: since low-cost automation of material handling has to be achieved, only equipment requiring an investment of less than 20 lakh rupees (INR) per unit, with a recovery period of less than five years, was considered. Fuzzy analytic hierarchy process (FAHP) is applied here for selecting the equipment, where the four choices are evaluated on the basis of four major criteria and 13 sub-criteria and are prioritized on the basis of the weights obtained. The FAHP used here makes use of triangular fuzzy numbers (TFN) and thereby overcomes the inability of the traditional AHP to deal with the subjectivity and imprecision of the pairwise comparison process. The range of values for general rating purposes for all decision-making parameters is kept between 0 and 1 on the basis of expert opinions captured on the shop floor; these experts were familiar with the operating environment and shop floor activity control. Instead of generating exact values, the FAHP generates ranges of values to accommodate the uncertainty in the decision-making process. The four major criteria selected for evaluating the available material handling equipment are materials, technical capabilities, cost, and other features. The thirteen sub-criteria listed under these four major criteria are weighing capacity, load per hour, material compatibility, capital cost, operating cost, maintenance cost, speed, distance moved, space required, frequency of trips, control required, safety, and reliability issues. The key finding is that, among the four major criteria, cost emerges as the most important and is one of the key aspects on which material handling equipment selection is based. Further evaluation of the available equipment against each sub-criterion shows that AGV scores the highest weight on most of the sub-criteria. The complete analysis shows that AGV is the material handling equipment best suited to all the decision criteria selected in the FAHP, and it is therefore beneficial for the organization to automate material handling in the facility using AGVs.
Keywords: fuzzy analytic hierarchy process (FAHP), material handling equipment, subjectiveness, triangular fuzzy number (TFN)
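As an illustration of the TFN weighting idea, here is a minimal sketch of FAHP criteria weighting via Buckley's fuzzy geometric-mean method; the pairwise judgments below are invented, not the paper's expert matrix, and the defuzzification choice (centroid) is an assumption.

```python
# Minimal FAHP sketch with triangular fuzzy numbers (l, m, u); judgments invented.
import numpy as np

# TFN pairwise comparison matrix for the 4 major criteria
# (materials, technical capabilities, cost, other features).
M = [[(1,1,1),       (1,2,3),       (1/4,1/3,1/2), (2,3,4)],
     [(1/3,1/2,1),   (1,1,1),       (1/5,1/4,1/3), (1,2,3)],
     [(2,3,4),       (3,4,5),       (1,1,1),       (4,5,6)],
     [(1/4,1/3,1/2), (1/3,1/2,1),   (1/6,1/5,1/4), (1,1,1)]]

n = len(M)
# Fuzzy geometric mean of each row: component-wise geometric mean of l, m, u.
g = np.array([[np.prod([row[j][k] for j in range(n)]) ** (1 / n)
               for k in range(3)] for row in M])
# Fuzzy weight w_i = g_i * (sum g)^(-1); inverting a TFN reverses (l, m, u).
total = g.sum(axis=0)
w_fuzzy = g / total[::-1]
# Centroid defuzzification, then normalization to crisp weights.
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print(dict(zip(["materials", "technical", "cost", "other"], w.round(3))))
```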
Procedia PDF Downloads 434
6663 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators
Authors: Andrea Bellucci, Martina Tofi
Abstract:
The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of its market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by using its strength in the distribution channel while the market share of independent agents is decreasing. Starting from the main business model of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by the Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design the business models' profiles. The result of the analysis is a representation of the main business models, each built from its profile of indicators. In this way, an unsupervised analysis is developed; it has the limitation of a judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
Keywords: bancassurance, business model, non life bancassurance, insurance business value drivers
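The Ward classification step can be sketched with SciPy's hierarchical clustering; the indicator matrix below is synthetic stand-in data, since the ANIA database is not public.

```python
# Sketch of Ward-linkage classification on standardized balance-sheet indicators.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Rows: life insurance companies; columns: indicators (e.g., premium growth,
# expense ratio, ROE), standardized to zero mean and unit variance.
X = rng.normal(size=(40, 3))
X = (X - X.mean(axis=0)) / X.std(axis=0)

Z = linkage(X, method="ward")                      # Ward's minimum-variance criterion
labels = fcluster(Z, t=4, criterion="maxclust")    # cut into 4 business-model profiles
for k in range(1, 5):
    profile = X[labels == k].mean(axis=0)
    print(f"business model {k}: mean indicator profile {profile.round(2)}")
```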
Procedia PDF Downloads 298
6662 Reducing Uncertainty in Climate Projections over Uganda by Numerical Models Using Bias Correction
Authors: Isaac Mugume
Abstract:
Since the beginning of the 21st century, climate change has been an issue due to the reported rise in global temperature and changes in the frequency as well as the severity of extreme weather and climatic events. The changing climate has been attributed to rising concentrations of greenhouse gases and to environmental changes such as changes in ecosystems and land use. Climatic projections have been carried out under the auspices of the Intergovernmental Panel on Climate Change, where a number of models have been run to inform us about the likelihood of future climates. Since one of the major forcings of the changing climate is the emission of greenhouse gases, different scenarios have been proposed and future climates for different periods presented. The global climate models project different areas to experience different impacts. While regional modeling is being carried out for high-impact studies, bias correction is less documented; yet regional climate models suffer from bias, which introduces uncertainty. This is addressed in this study by bias correcting the regional models. The study uses the Weather Research and Forecasting model under different representative concentration pathways and corrects the products of these models using observed climatic data. The study notes that bias correction (e.g., running-mean bias correction, the best easy systematic estimator method, the simple linear regression method, nearest neighborhood, and weighted mean) improves the skill of the climatic projections and therefore reduces the uncertainty inherent in them.
Keywords: bias correction, climatic projections, numerical models, representative concentration pathways
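Two of the named schemes are easy to sketch on a toy series; the observed and modeled temperatures below are synthetic, not WRF or Ugandan station data.

```python
# Sketch of mean-bias and linear-regression bias correction on synthetic data.
import numpy as np

rng = np.random.default_rng(2)
obs = 22 + 2 * rng.standard_normal(240)              # observed monthly temps (deg C)
mod = obs + 1.5 + 0.5 * rng.standard_normal(240)     # raw model output, warm-biased

# (1) Mean correction: subtract the mean bias over the calibration period.
corrected_mean = mod - (mod - obs).mean()

# (2) Simple linear regression correction: regress obs on mod, then map the
# model output through the fitted line.
a, b = np.polyfit(mod, obs, deg=1)                   # slope, intercept
corrected_lr = a * mod + b

for name, series in [("raw", mod), ("mean-corrected", corrected_mean),
                     ("regression-corrected", corrected_lr)]:
    rmse = np.sqrt(((series - obs) ** 2).mean())
    print(f"{name:22s} RMSE vs observations: {rmse:.2f}")
```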
Procedia PDF Downloads 119
6661 A Nonlinear Dynamical System with Application
Authors: Abdullah Eqal Al Mazrooei
Abstract:
In this paper, a nonlinear dynamical system is presented. The system belongs to the bilinear class. Bilinear systems are a very important kind of nonlinear system because they have many applications in real life: they are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate, and they have also recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages: first, they define many problems of great applied importance; second, they give us approximations to nonlinear systems; and third, they have rich geometric and algebraic structures, which promise to be a fruitful field of research for scientists and applications. The type of nonlinearity that is treated and analyzed consists of bilinear interaction between the state vectors and the system input. By using some properties of the tensor product, these systems can be transformed into linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself, so the model will be able to handle evolutions according to the Lotka-Volterra models or the Lorenz weather models, enabling a wider and more flexible application of such models. As an application, an estimator is used to estimate temperatures. The results prove the efficiency of the proposed system.
Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model
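The state-times-state nonlinearity can be illustrated with the classical Lotka-Volterra system mentioned above; the parameter values and initial conditions in this sketch are illustrative only.

```python
# Lotka-Volterra sketch: the prey*predator products are the quadratic
# (state multiplied by itself) nonlinearity discussed in the paper.
import numpy as np
from scipy.integrate import solve_ivp

def lotka_volterra(t, x, a=1.0, b=0.4, c=0.3, d=1.2):
    prey, pred = x
    return [a * prey - b * prey * pred,     # prey growth minus predation
            c * prey * pred - d * pred]     # predator growth minus mortality

sol = solve_ivp(lotka_volterra, (0, 30), [10.0, 5.0], dense_output=True)
t = np.linspace(0, 30, 7)
print(np.round(sol.sol(t).T, 2))            # prey/predator states at sample times
```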
Procedia PDF Downloads 254
6660 Models of State Organization and Influence over Collective Identity and Nationalism in Spain
Authors: Victor Manuel Muñoz-Sanchez, Antonio Manuel Perez-Flores
Abstract:
The main objective of this paper is to establish the relationship between models of state organization and the various types of collective identity expressed by the Spanish. The question of nationalism and identity ascription in Spain has always been a topic of special importance due to the presence in that country of territories where the population expresses opinions of nationalist sentiment very different from those of the rest of Spain. The current sovereignty challenge of Catalonia to the central government exemplifies the importance of the subject matter. In order to analyze this process of interrelation, we use secondary data and apply the multiple correspondence analysis (MCA) technique. As the main result, a typology of four types of expression of collective identity based on models of state organization is shown, which is connected with party positions on this issue.
Keywords: models of organization of the state, nationalism, collective identity, Spain, political parties
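MCA can be computed as correspondence analysis of the one-hot indicator matrix; the survey variables and categories in this sketch are invented for illustration, not the study's actual questionnaire items.

```python
# MCA sketch: correspondence analysis of the indicator (disjunctive) matrix.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "identity":    ["only Spanish", "dual", "only regional", "dual", "only Spanish"],
    "state_model": ["centralist", "federal", "independence", "autonomic", "centralist"],
})
Z = pd.get_dummies(df).to_numpy(float)               # indicator matrix

P = Z / Z.sum()                                      # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)                  # row / column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
U, D, Vt = np.linalg.svd(S, full_matrices=False)

rows = U[:, :2] * D[:2] / np.sqrt(r)[:, None]        # respondent coordinates, 2 axes
print(np.round(rows, 2))
```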
Procedia PDF Downloads 443
6659 Mosaic Augmentation: Insights and Limitations
Authors: Olivia A. Kjorlien, Maryam Asghari, Farshid Alizadeh-Shabdiz
Abstract:
The goal of this paper is to investigate the impact of mosaic augmentation on the performance of object detection solutions. To carry out the study, the YOLOv4 and YOLOv4-Tiny models were selected; they are popular, advanced object detection models and also represent two classes of models, complex and simple. The study was likewise carried out on two categories of objects, simple and complex. YOLOv4 and YOLOv4-Tiny were trained with and without mosaic augmentation for the two sets of objects. While mosaic augmentation improves the performance of simple object detection, it deteriorates the performance of complex object detection, with the largest negative impact on the false positive rate in the complex object detection case.
Keywords: accuracy, false positives, mosaic augmentation, object detection, YOLOV4, YOLOV4-Tiny
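For readers unfamiliar with the technique, here is a minimal sketch of mosaic augmentation itself: four training images are tiled into one 2x2 composite. Real implementations also use random scales and crop centers and remap the bounding boxes; both are omitted here for brevity.

```python
# Simplified mosaic augmentation: tile 4 images into one fixed 2x2 composite.
import numpy as np

def mosaic(images, out_size=416):
    """Tile four HxWx3 uint8 images into one out_size x out_size mosaic."""
    half = out_size // 2
    canvas = np.zeros((out_size, out_size, 3), dtype=np.uint8)
    corners = [(0, 0), (0, half), (half, 0), (half, half)]
    for img, (y, x) in zip(images, corners):
        # Nearest-neighbour resize via index sampling (avoids extra dependencies).
        ys = np.linspace(0, img.shape[0] - 1, half).astype(int)
        xs = np.linspace(0, img.shape[1] - 1, half).astype(int)
        canvas[y:y + half, x:x + half] = img[ys][:, xs]
    return canvas

imgs = [np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8) for _ in range(4)]
print(mosaic(imgs).shape)   # (416, 416, 3)
```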
Procedia PDF Downloads 127
6658 Investigation of Different Control Strategies for UPFC Decoupled Model and the Impact of Location on Control Parameters
Authors: S. A. Al-Qallaf, S. A. Al-Mawsawi, A. Haider
Abstract:
In order to evaluate the performance of a unified power flow controller (UPFC), mathematical models for steady-state and dynamic analysis are to be developed. The steady-state model is mainly concerned with the incorporation of the UPFC in load flow studies. Several load flow models for the UPFC have been introduced in the literature, and one of the most reliable is the decoupled UPFC model. In spite of its simplicity, the decoupled UPFC load flow model is more robust than other UPFC load flow models and contains unique capabilities, although it has some shortcomings, such as an additional set of nonlinear equations that must be solved separately after the load flow solution is obtained. The aim of this study is to investigate the different control strategies that can be realized in the decoupled load flow model (individual control and combined control) and the impact of the location of the UPFC in the network on its control parameters.
Keywords: UPFC, decoupled model, load flow, control parameters
Procedia PDF Downloads 555
6657 A Study on Characteristics of Hedonic Price Models in Korea Based on Meta-Regression Analysis
Authors: Minseo Jo
Abstract:
The purpose of this paper is to examine the factors in hedonic price models that have a significant impact on determining the price of apartments. Many variables are employed in hedonic price models, and their effectiveness varies according to the researchers and the regions being analysed. In order to consider these varying conditions, meta-regression analysis has been selected for the study. Four meta-variables are drawn from 65 hedonic price models for the analysis: the factors that influence apartment prices, the regions analysed (divided into two groups), the years in which the research was performed, and the coefficients of the functions employed. The covariance between the four meta-variables and the p-values of the coefficients, and between the four meta-variables and the number of data points used in the 65 hedonic price models, has been analysed in this study. The six factors that are most important in determining apartment prices are the positioning of the apartment, noise, orientation and views from the apartment, proximity to public transportation, the company that constructed the apartment, and the social environment (such as schools).
Keywords: hedonic price model, housing price, meta-regression analysis, characteristics
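A single hedonic price model of the kind pooled in such a meta-analysis typically regresses log price on attributes; the data and coefficients in this sketch are synthetic, not drawn from the 65 Korean studies.

```python
# Sketch of one hedonic price model: OLS of log(price) on apartment attributes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([
    rng.normal(85, 15, n),      # floor area (m^2)
    rng.integers(1, 25, n),     # floor level (positioning)
    rng.normal(0.8, 0.3, n),    # distance to public transport (km)
])
log_price = (11 + 0.008 * X[:, 0] + 0.01 * X[:, 1] - 0.15 * X[:, 2]
             + rng.normal(0, 0.05, n))

model = sm.OLS(log_price, sm.add_constant(X)).fit()
print(model.params.round(3))    # hedonic coefficients
print(model.pvalues.round(4))   # the p-values that feed the meta-regression
```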
Procedia PDF Downloads 402
6656 Modified Fuzzy Delphi Method to Incorporate Healthcare Stakeholders’ Perspectives in Selecting Quality Improvement Projects’ Criteria
Authors: Alia Aldarmaki, Ahmad Elshennawy
Abstract:
There is a global shift in healthcare systems toward engaging different stakeholders in selecting quality improvement initiatives and incorporating their preferences to improve healthcare efficiency and outcomes. Although experts bring scientific knowledge based on the scientific model and their personal experience, other stakeholders can bring new insights and information into the decision-making process. This study attempts to explore the impact of incorporating different stakeholders' preferences when identifying the most significant criteria that should be considered in healthcare for selecting improvement projects. A framework based on a modified Fuzzy Delphi Method (FDM) was built in which, in addition to the subject matter experts, groups of doctors/physicians, nurses, administrators, and managers contribute to the selection process. The research identifies potential criteria for evaluating projects in healthcare and then utilizes FDM to capture expert knowledge. The first round of FDM is intended to validate the identified list of criteria with the experts, which includes collecting additional criteria from them that the literature might have overlooked. When an acceptable level of consensus has been reached, a second round is conducted to obtain the experts' and other related stakeholders' opinions on the appropriate weight of each criterion's importance, using linguistic variables. The FDM analysis eliminates or retains criteria to produce a final list of the critical criteria for selecting improvement projects in healthcare. Finally, reliability and validity were investigated using Cronbach's alpha and factor analysis, respectively. Two case studies were carried out in a public hospital in the United Arab Emirates to test the framework. Both cases demonstrate that even though there were common criteria between the experts and the other stakeholders, the stakeholders' perceptions bring additional critical criteria into the evaluation process, which can impact the outcomes. Experts selected criteria related to strategic and managerial aspects, while the other participants preferred criteria related to social aspects, such as health and safety and patients' satisfaction. The health and safety criterion had the highest importance weight in both cases. The analysis showed that the Cronbach's alpha value is 0.977 and all criteria have factor loadings greater than 0.3. In conclusion, the inclusion of stakeholders' perspectives is intended to enhance stakeholder engagement, improve transparency throughout the decision process, and support robust decisions.
Keywords: Fuzzy Delphi Method, fuzzy number, healthcare, stakeholders
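The FDM screening round can be sketched as follows: linguistic ratings are mapped to triangular fuzzy numbers, aggregated, defuzzified, and compared with a retention threshold. The linguistic scale, the ratings, and the threshold value are illustrative assumptions, not the study's instrument.

```python
# Fuzzy Delphi screening sketch: aggregate TFN ratings, defuzzify, threshold.
import numpy as np

SCALE = {  # linguistic variable -> triangular fuzzy number (l, m, u)
    "very low": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
    "medium": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def fdm_screen(ratings, threshold=0.5):
    tfns = np.array([SCALE[r] for r in ratings])
    l, u = tfns[:, 0].min(), tfns[:, 2].max()               # fuzzy envelope
    m = np.exp(np.log(np.clip(tfns[:, 1], 1e-9, None)).mean())  # geometric mean
    score = (l + m + u) / 3.0                               # centroid defuzzification
    return score, score >= threshold                        # retain if above threshold

ratings = {"health and safety":   ["very high", "high", "very high", "high"],
           "strategic alignment": ["medium", "high", "low", "medium"]}
for criterion, r in ratings.items():
    s, keep = fdm_screen(r)
    print(f"{criterion:20s} score={s:.3f} -> {'retain' if keep else 'eliminate'}")
```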
Procedia PDF Downloads 128
6655 The Use of Geographic Information System for Selecting Landfill Sites in Osogbo
Authors: Nureni Amoo, Sunday Aroge, Oluranti Akintola, Hakeem Olujide, Ibrahim Alabi
Abstract:
This study investigated the optimum landfill site in Osogbo so as to identify a suitable solid waste dumpsite for proper waste management in the capital city. Despite an increase in alternative techniques for disposing of waste, landfilling remains the primary means of waste disposal, and changes in attitudes in many parts of the world have been supported by changes in laws and policies regarding the environment and waste disposal. Selecting the most suitable landfill site can avoid adverse ecological and socio-economic effects. The growth of industrial and economic development, along with population growth in Osogbo town, generates a tremendous amount of solid waste within the region. Factors such as the scarcity of land, the lifespan of the landfill, and environmental considerations warrant that scientific and fundamental studies be carried out to determine the suitability of a landfill site. The analysis of spatial data and the consideration of regulations and accepted criteria are among the important elements in site selection. This paper presents a multi-criteria decision-making method using a geographic information system (GIS) with the integration of the fuzzy logic multi-criteria decision making (FMCDM) technique for landfill site suitability evaluation. Using the fuzzy logic method (classification of suitable areas on a scale of 0 to 1), the information layers related to drainage, soil, land use/land cover, slope, and geology maps were superimposed in the study. Based on the results obtained, five (5) potential sites suitable for the construction of a landfill are proposed, two of which belong to the most suitable zone, while the existing waste disposal site belongs to the unsuitable zone.
Keywords: fuzzy logic multi-criteria decision making, geographic information system, landfill, suitable site, waste disposal
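The fuzzy overlay at the core of this workflow can be sketched with NumPy rasters: each criterion layer is rescaled to 0-1 membership and combined with weights. The grid, membership breakpoints, and weights below are assumptions for illustration, not the study's calibrated values.

```python
# Fuzzy-overlay sketch: rescale criterion rasters to 0-1 and combine with weights.
import numpy as np

rng = np.random.default_rng(4)
shape = (100, 100)                            # toy raster grid over the study area
slope = rng.uniform(0, 30, shape)             # percent slope
dist_drainage = rng.uniform(0, 2000, shape)   # metres to nearest stream

def fuzzy_linear(x, lo, hi):
    """Linear membership: 0 at lo, 1 at hi (reverse lo/hi to invert)."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

suit_slope = fuzzy_linear(slope, 30, 0)              # flatter is more suitable
suit_drain = fuzzy_linear(dist_drainage, 0, 1000)    # farther from streams is better

weights = {"slope": 0.4, "drainage": 0.6}            # assumed criterion weights
suitability = weights["slope"] * suit_slope + weights["drainage"] * suit_drain
print(f"share of highly suitable cells (>0.8): {(suitability > 0.8).mean():.1%}")
```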
Procedia PDF Downloads 142
6654 Convolutional Neural Networks Architecture Analysis for Image Captioning
Authors: Jun Seung Woo, Shin Dong Ho
Abstract:
Image captioning models with attention technology have developed significantly compared to previous models, but they are still unsatisfactory in recognizing images. We perform an extensive search over seven interesting Convolutional Neural Network (CNN) architectures to analyze the behavior of different models for image captioning. We compared the seven CNN architectures, according to batch size, on a public benchmark: the MS-COCO dataset. In our experimental results, DenseNet and InceptionV3 achieved about 14% loss and about 160 sec of training time per epoch, which was the most satisfactory result among the seven CNN architectures after training for 50 epochs on a GPU.
Keywords: deep learning, image captioning, CNN architectures, densenet, inceptionV3
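Swapping CNN encoders in a captioning pipeline amounts to exchanging the feature extractor; here is a minimal Keras sketch instantiating two of the compared architectures as feature extractors (the captioning decoder itself is out of scope, and the input size is an assumption).

```python
# Sketch: two of the seven compared CNNs as image-feature encoders for captioning.
import numpy as np
import tensorflow as tf

def build_encoder(name):
    cls = {"densenet": tf.keras.applications.DenseNet121,
           "inception": tf.keras.applications.InceptionV3}[name]
    # include_top=False + global average pooling -> one feature vector per image.
    return cls(include_top=False, weights="imagenet", pooling="avg")

images = np.random.rand(2, 299, 299, 3).astype("float32")
for name in ["densenet", "inception"]:
    feats = build_encoder(name)(images)
    print(name, feats.shape)   # features fed to the attention/caption decoder
```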
Procedia PDF Downloads 132
6653 Models and Metamodels for Computer-Assisted Natural Language Grammar Learning
Authors: Evgeny Pyshkin, Maxim Mozgovoy, Vladislav Volkov
Abstract:
The paper follows a discourse on computer-assisted language learning. We examine problems of foreign language teaching and learning and introduce a metamodel that can be used to define learning models of language grammar structures in order to support teacher/student interaction. Special attention is paid to the concept of a virtual language lab. Our approach to language education encourages learners to experiment with a language and to learn by discovering patterns of grammatically correct structures created and managed by a language expert.
Keywords: computer-assisted instruction, language learning, natural language grammar models, HCI
Procedia PDF Downloads 519
6652 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the numbers of susceptible, infected, and recovered individuals are fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
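The second approach (infected counts only) can be sketched end-to-end: simulate SIR curves over sampled parameters, then train an LSTM to map each curve back to (beta, gamma). The network size, parameter ranges, and population are illustrative assumptions, not the paper's configuration.

```python
# Sketch: LSTM calibration of SIR parameters from simulated infected curves.
import numpy as np
import tensorflow as tf

def simulate_sir(beta, gamma, n=500, days=60):
    s, i, r = n - 1.0, 1.0, 0.0
    curve = []
    for _ in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        curve.append(i / n)                      # observed: infected fraction only
    return np.array(curve)

rng = np.random.default_rng(5)
params = np.column_stack([rng.uniform(0.2, 0.6, 2000),    # beta
                          rng.uniform(0.05, 0.2, 2000)])  # gamma
X = np.stack([simulate_sir(b, g) for b, g in params])[..., None]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(60, 1)),
    tf.keras.layers.Dense(2, activation="sigmoid"),  # (beta, gamma), both in (0, 1)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, params, epochs=3, batch_size=64, verbose=0)
print(model.predict(X[:3], verbose=0).round(3), params[:3].round(3))
```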
Procedia PDF Downloads 24
6651 Brain Tumor Detection and Classification Using Pre-Trained Deep Learning Models
Authors: Aditya Karade, Sharada Falane, Dhananjay Deshmukh, Vijaykumar Mantri
Abstract:
Brain tumours pose a significant challenge in healthcare due to their complex nature and impact on patient outcomes. The application of deep learning (DL) algorithms in medical imaging has shown promise for accurate and efficient brain tumour detection. This paper explores the performance of various pre-trained DL models (ResNet50, Xception, InceptionV3, EfficientNetB0, DenseNet121, NASNetMobile, VGG19, VGG16, and MobileNet) on a brain tumour dataset sourced from Figshare. The dataset consists of MRI scans categorized into different types of brain tumours, including meningioma, pituitary, and glioma, as well as no tumour. The study involves a comprehensive evaluation of these models' accuracy and effectiveness in classifying brain tumour images. Data preprocessing, augmentation, and fine-tuning techniques are employed to optimize model performance. Among the evaluated deep learning models for brain tumour detection, ResNet50 emerges as the top performer with an accuracy of 98.86%. Following closely is Xception, exhibiting a strong accuracy of 97.33%. These models showcase robust capabilities in accurately classifying brain tumour images. On the other end of the spectrum, VGG16 trails with the lowest accuracy at 89.02%.
Keywords: brain tumour, MRI image, detecting and classifying tumour, pre-trained models, transfer learning, image segmentation, data augmentation
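A minimal transfer-learning setup for the four-class problem looks like the sketch below; the classification head, input size, and hyper-parameters are assumptions, not the authors' exact configuration.

```python
# Sketch: ResNet50 transfer learning for the 4-class brain tumour problem.
import tensorflow as tf

NUM_CLASSES = 4   # meningioma, pituitary, glioma, no tumour

base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                      input_shape=(224, 224, 3), pooling="avg")
base.trainable = False            # freeze ImageNet features; unfreeze to fine-tune

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # MRI datasets go here
```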
Procedia PDF Downloads 74
6650 Continuum-Based Modelling Approaches for Cell Mechanics
Authors: Yogesh D. Bansod, Jiri Bursa
Abstract:
The quantitative study of cell mechanics is of paramount interest since cell mechanics regulates the behavior of living cells in response to the myriad of extracellular and intracellular mechanical stimuli. Novel experimental techniques together with robust computational approaches have given rise to new theories and models, which describe cell mechanics as a combination of biomechanical and biochemical processes. This review paper encapsulates the existing continuum-based computational approaches that have been developed for interpreting the mechanical responses of living cells under different loading and boundary conditions. The salient features and drawbacks of each model are discussed from both structural and biological points of view. This discussion can contribute to the development of even more precise and realistic computational models of cell mechanics based on continuum approaches or on their combination with microstructural approaches, which in turn may provide a better understanding of mechanotransduction in living cells.
Keywords: cell mechanics, computational models, continuum approach, mechanical models
Procedia PDF Downloads 363
6649 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources
Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha
Abstract:
Training a language model for a minority language has been a challenging task. The lack of available corpora with which to train and fine-tune state-of-the-art language models is still a challenge in the area of Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits the attainment of this task. In this paper, we present the following contributions: (1) we introduce and use a translation pair set of Tagalog and English (TL-EN) in pre-training a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking pre-trained semantic textual similarity binary task (STSB) models on both TL-EN and STS dataset pairs; and (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, the models that were pre-trained on translation pairs and STS pairs can perform well on the STSB task. Reducing the model to a smaller dimension also has no negative effect on performance; rather, it yields a notable increase in the similarity scores. Moreover, models that were pre-trained on a similar dataset show a tremendous improvement in performance scores.
Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models
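The dimension-reduction experiment can be sketched as follows: sentence embeddings from a transformer are reduced with PCA and pairs are compared by cosine similarity. The model name is a publicly available multilingual stand-in, not the authors' fine-tuned TL-EN model, and the sentence pairs are invented.

```python
# Sketch: embed TL-EN pairs, reduce dimension with PCA, compare by cosine.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.decomposition import PCA

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")  # stand-in
pairs = [("Kumusta ka?", "How are you?"),
         ("Mahal kita", "The weather is cold")]

sents = [s for p in pairs for s in p]
emb = model.encode(sents)                            # full-size embeddings
emb_small = PCA(n_components=2).fit_transform(emb)   # reduced dimension

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

for k, (s1, s2) in enumerate(pairs):
    full = cos(emb[2 * k], emb[2 * k + 1])
    red = cos(emb_small[2 * k], emb_small[2 * k + 1])
    print(f"{s1!r} vs {s2!r}: full={full:.2f} reduced={red:.2f}")
```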
Procedia PDF Downloads 211
6648 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate
Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe
Abstract:
This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model with a SETAR model. The study uses monthly adjusted data on South African exchange rates comprising 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecast capability of the models. The Diebold-Mariano (DM) test is employed to check forecast accuracy and thereby distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR appears to outperform ARIMA.
Keywords: ARIMA, error metrics, model selection, SETAR
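The ARIMA side of such a comparison, with the three error metrics, can be sketched as below; the series is synthetic, not the South African exchange rate data, and the (1,1,1) order is an assumed placeholder for whatever AIC/SIC selects.

```python
# Sketch: fit ARIMA, forecast the hold-out, compute MAE / RMSE / MAPE.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
y = 10 + np.cumsum(rng.normal(0, 0.1, 420))   # 420 monthly observations (toy)
train, test = y[:400], y[400:]

fit = ARIMA(train, order=(1, 1, 1)).fit()     # order chosen by AIC/SIC in practice
pred = fit.forecast(steps=len(test))

mae = np.mean(np.abs(test - pred))
rmse = np.sqrt(np.mean((test - pred) ** 2))
mape = np.mean(np.abs((test - pred) / test)) * 100
print(f"MAE={mae:.4f}  RMSE={rmse:.4f}  MAPE={mape:.2f}%")
```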
Procedia PDF Downloads 244
6647 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data
Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar
Abstract:
It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science; the development of accurate, robust, and reliable forecasting methods is therefore very important. Numerous forecasting methods have been proposed and studied in the literature. Two major methods still dominate: Box-Jenkins ARIMA and exponential smoothing (ES), and new methods continue to be derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing remains one of the most practically relevant forecasting methods available, due to its simplicity, robustness, and accuracy as an automatic forecasting procedure, demonstrated especially in the famous M-competitions. Despite their success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. The new method is obtained from traditional ES models by modifying the smoothing parameters; both methods therefore have similar structural forms, and ATA can easily be adapted to each of the individual ES models, yet ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. The ATA method is therefore expanded to higher-order ES methods for additive, multiplicative, additive damped, and multiplicative damped trend components. The proposed models, called ATA trended models, are compared in predictive performance against their counterpart ES models on the M3-competition data set, since it is still the most recent and comprehensive time-series data collection available. It is shown that the models outperform their counterparts in almost all settings, and when a model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared based on popular error metrics.
Keywords: accuracy, exponential smoothing, forecasting, initial value
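For intuition about the modified weighting scheme, here is a sketch of the additive trended ATA recursion as it is commonly stated in the ATA literature, where Holt's fixed smoothing constants are replaced by time-varying weights p/t and q/t. Treat the exact form, initialization, and parameter values as assumptions rather than the authors' reference implementation.

```python
# Sketch of the additive trended ATA recursion (assumed form, not the
# authors' reference implementation).
import numpy as np

def ata_additive(y, p=2, q=1):
    S, T = y[0], 0.0                      # level and additive trend
    for t in range(2, len(y) + 1):
        x = y[t - 1]
        S_prev = S
        if t >= p:
            S = (p / t) * x + ((t - p) / t) * (S + T)   # weight p/t replaces alpha
        else:
            S = x
        if t >= q:
            T = (q / t) * (S - S_prev) + ((t - q) / t) * T  # weight q/t replaces beta
    return S + T                          # one-step-ahead forecast

rng = np.random.default_rng(7)
series = 50 + 0.5 * np.arange(100) + rng.normal(0, 2, 100)
print(f"next-period forecast: {ata_additive(series):.2f}")
```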
Procedia PDF Downloads 177
6646 DGA Data Interpretation Using Extension Theory for Power Transformer Diagnostics
Authors: O. P. Rahi, Manoj Kumar
Abstract:
Power transformers are essential and expensive equipment in electrical power systems. Dissolved gas analysis (DGA) is one of the most useful techniques to detect incipient faults in power transformers. However, the identification of the fault location by conventional methods is not always an easy task due to the variability of gas data and operational variables. In this paper, an extension theory based power transformer fault diagnosis method is presented. Extension theory tries to solve contradiction and incompatibility problems. The paper first briefly introduces the basic concept of matter-element theory, establishes the matter-element models for the three-ratio method, and then briefly discusses extension set theory. A detailed analysis is carried out on the extended relation function (ERF) adopted in this paper for transformer fault diagnosis, and the detailed diagnostic steps are offered. Simulation proves that the proposed method can overcome drawbacks of the conventional three-ratio method, such as codes with no match and failure to diagnose multiple faults, and it enhances diagnostic accuracy.
Keywords: DGA, extension theory, ERF, power transformers, fault diagnosis, fuzzy logic
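The three gas ratios that the matter-element models encode are straightforward to compute; the sketch below shows the ratio computation on invented ppm values, with a single deliberately coarse illustrative rule. It is not the paper's extension model nor a complete IEC ratio-code table.

```python
# Sketch: the three DGA gas ratios underlying the three-ratio method.
def three_ratios(h2, ch4, c2h6, c2h4, c2h2):
    return {"C2H2/C2H4": c2h2 / c2h4,
            "CH4/H2": ch4 / h2,
            "C2H4/C2H6": c2h4 / c2h6}

sample = dict(h2=200.0, ch4=50.0, c2h6=30.0, c2h4=60.0, c2h2=40.0)  # ppm, invented
ratios = three_ratios(**sample)
print({k: round(v, 2) for k, v in ratios.items()})

# Deliberately coarse illustration of a ratio-based indication (not a full code table):
if ratios["C2H2/C2H4"] > 0.1:
    print("indication: possible discharge-type fault (illustrative rule only)")
```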
Procedia PDF Downloads 412
6645 Shear Strength Evaluation of Ultra-High-Performance Concrete Flexural Members Using Adaptive Neuro-Fuzzy System
Authors: Minsu Kim, Hae-Chang Cho, Jae Hoon Chung, Inwook Heo, Kang Su Kim
Abstract:
For the safe design of UHPC flexural members, accurate estimation of their shear strengths is very important. However, since the shear strengths are significantly affected by various factors, such as the tensile strength of concrete, the shear span-to-depth ratio, the volume ratio of steel fiber, and the steel fiber factor, accurate estimation is very challenging. In this study, therefore, the Adaptive Neuro-Fuzzy Inference System (ANFIS), which has been widely used to solve many complex problems in engineering fields, was introduced to estimate the shear strengths of UHPC flexural members. A total of 32 experimental results have been collected from previous studies for training of the ANFIS algorithm, and the well-trained ANFIS algorithm provided good estimations of the shear strengths of the UHPC test specimens. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (NRF-2016R1A2B2010277).
Keywords: ultra-high-performance concrete, ANFIS, shear strength, flexural member
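To make the ANFIS structure concrete, here is a minimal sketch of the forward pass of a first-order Sugeno fuzzy system of the kind ANFIS trains. The membership parameters, the linear consequents, and the choice of two inputs (shear span-to-depth ratio a/d and fiber volume ratio Vf) are invented assumptions; a real ANFIS learns these from the 32 experiments.

```python
# Sketch: forward pass of a first-order Sugeno fuzzy system (ANFIS structure).
import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

# Two Gaussian membership functions per input -> 4 rules (parameters invented).
MF = {"a_d": [(1.5, 0.8), (3.5, 0.8)],   # (center, sigma)
      "vf":  [(0.5, 0.4), (1.5, 0.4)]}
# Linear consequents per rule: V = w0 + w1*a_d + w2*vf (invented values).
CONSEQ = np.array([[400, -60, 150], [300, -40, 120],
                   [350, -55, 140], [250, -35, 100]], float)

def anfis_predict(a_d, vf):
    firing = np.array([gauss(a_d, *MF["a_d"][i]) * gauss(vf, *MF["vf"][j])
                       for i in (0, 1) for j in (0, 1)])
    w = firing / firing.sum()                      # normalized firing strengths
    rule_out = CONSEQ @ np.array([1.0, a_d, vf])   # first-order consequents
    return float(w @ rule_out)                     # weighted-average defuzzification

print(f"predicted shear strength: {anfis_predict(2.5, 1.0):.1f} kN (illustrative)")
```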
Procedia PDF Downloads 188
6644 Advancing Communication Theory in the Age of Digital Technology: Bridging the Gap Between Traditional Models and Emerging Platforms
Authors: Sidique Fofanah
Abstract:
This paper explores the intersection of traditional communication theories and modern digital technologies, analyzing how established models adapt to contemporary communication platforms. It examines the evolving nature of interpersonal, group, and mass communication within digital environments, emphasizing the role of social media, AI-driven communication tools, and virtual reality in reshaping communication paradigms. The paper also discusses the implications for future research and practice in communication studies, proposing an integrated framework that accommodates both classical and emerging theories.
Keywords: communication, traditional models, emerging platforms, digital media
Procedia PDF Downloads 25
6643 Mathematical Modeling of Carotenoids and Polyphenols Content of Faba Beans (Vicia faba L.) during Microwave Treatments
Authors: Ridha Fethi Mechlouch, Ahlem Ayadi, Ammar Ben Brahim
Abstract:
Given the importance of preserving polyphenols and carotenoids during thermal processing, we attempted in this study to investigate the variation of these two parameters in faba beans during microwave treatment using different power densities (1, 2, and 3 W/g) and then to perform mathematical modeling using non-linear regression analysis to evaluate the model constants. The models are tested against the measured variation of the carotenoid and polyphenol ratios of faba beans to validate the experimental results. Exponential models were found to be suitable to describe the variation of the carotenoid ratio (R² = 0.945, 0.927, and 0.946) and of the polyphenol ratio (R² = 0.931, 0.989, and 0.982) for power densities of 1, 2, and 3 W/g, respectively. The effect of the microwave power density Pd (W/g) on the coefficient k of the models was also investigated. The coefficient is highly correlated (R² = 1) and can be expressed as a polynomial function.
Keywords: microwave treatment, power density, carotenoid, polyphenol, modeling
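The non-linear regression step can be sketched with SciPy's curve fitting; the exponential form and the data points below are synthetic illustrations, not the measured faba bean ratios.

```python
# Sketch: fit an exponential decay to a carotenoid-ratio curve and report k and R^2.
import numpy as np
from scipy.optimize import curve_fit

def exp_model(t, k):
    return np.exp(-k * t)     # ratio C/C0 decaying with treatment time

t = np.linspace(0, 10, 11)    # treatment time (min), toy grid
rng = np.random.default_rng(8)
ratio = np.exp(-0.12 * t) + rng.normal(0, 0.01, t.size)

(k_hat,), _ = curve_fit(exp_model, t, ratio, p0=[0.1])
resid = ratio - exp_model(t, k_hat)
r2 = 1 - resid.var() / ratio.var()
print(f"k = {k_hat:.4f} per min, R^2 = {r2:.3f}")
```

Repeating the fit at each power density and regressing the resulting k values on Pd would reproduce the polynomial relationship the abstract describes.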
Procedia PDF Downloads 259
6642 Exchange Rate Forecasting by Econometric Models
Authors: Zahid Ahmad, Nosheen Imran, Nauman Ali, Farah Amir
Abstract:
The objective of the study is to forecast the US Dollar and Pak Rupee exchange rate by using time series models. For this purpose, daily exchange rates of the US and Pakistan for the period January 01, 2007 - June 2, 2017, are employed. The data set is divided into in-sample and out-of-sample portions, where the in-sample data are used to estimate the models, whereas the out-of-sample data set is exercised to forecast the exchange rate. The ADF test and PP test are used to check the stationarity of the time series. To forecast the exchange rate, the ARIMA model and the GARCH model are applied. Among the different Autoregressive Integrated Moving Average (ARIMA) models, the best model is selected on the basis of selection criteria. Due to volatility clustering and the ARCH effect, the GARCH(1,1) model is also applied. The results of the analysis showed that ARIMA(0,1,1) and GARCH(1,1) are the most suitable models to forecast the future exchange rate. Furthermore, the GARCH(1,1) model captured the non-constant conditional variance in the exchange rate with good forecasting performance. This study is very useful for researchers, policymakers, and businesses for making decisions through accurate and timely forecasting of the exchange rate, and it helps them in devising their policies.
Keywords: exchange rate, ARIMA, GARCH, PAK/USD
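The GARCH(1,1) step can be sketched with the `arch` package; the return series below is simulated, not the PKR/USD data.

```python
# Sketch: GARCH(1,1) on daily returns with the `arch` package.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(9)
returns = rng.standard_t(df=6, size=2500) * 0.5    # toy daily % returns, fat-tailed

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.params.round(4))                  # omega, alpha[1], beta[1] estimates
forecast = res.forecast(horizon=5)
print(forecast.variance.iloc[-1].round(4))  # 5-day-ahead conditional variance
```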
Procedia PDF Downloads 561
6641 Study on Flexible Diaphragm In-Plane Model of Irregular Multi-Storey Industrial Plant
Authors: Cheng-Hao Jiang, Mu-Xuan Tao
Abstract:
The rigid diaphragm model may cause errors in the calculation of internal forces because it neglects the in-plane deformation of the diaphragm. This paper thus studies the effects of different diaphragm in-plane models (an in-plane rigid model and an in-plane flexible model) on the seismic performance of structures. Taking an actual industrial plant as an example, the seismic performance of the structure is predicted using the different floor diaphragm models, and the analysis errors caused by the different diaphragm in-plane models, including the deformation error and the internal force error, are calculated. Furthermore, the influence of the aspect ratio on the analysis errors is investigated. Finally, the rationality of the code is evaluated by assessing the analysis errors of the structural models whose floors were determined to be rigid according to the code's criterion. It is found that different floor models may cause great differences in the distribution of structural internal forces and that the current code may underestimate the influence of the floor in-plane effect.
Keywords: industrial plant, diaphragm, calculating error, code rationality
Procedia PDF Downloads 140
6640 Streamlining the Fuzzy Front-End and Improving the Usability of the Tools Involved
Authors: Michael N. O'Sullivan, Con Sheahan
Abstract:
Researchers have spent decades developing tools and techniques to aid teams in the new product development (NPD) process. Despite this, it is evident that there is a huge gap between their academic prevalence and their industry adoption. For the fuzzy front-end, in particular, there is a wide range of tools to choose from, including the Kano Model, the House of Quality, and many others. In fact, there are so many tools that it can often be difficult for teams to know which ones to use and how they interact with one another. Moreover, while the benefits of using these tools are obvious to industrialists, they are rarely used, as they carry a learning curve that is too steep and become too complex to manage over time. In essence, it is commonly believed that they are simply not worth the effort required to learn and use them. This research explores a streamlined process for the fuzzy front-end, assembling the most effective tools and making them accessible to everyone. The process was developed iteratively over the course of 3 years, following over 80 final-year NPD teams from engineering, design, technology, and construction as they carried a product from concept through to production specification. Questionnaires, focus groups, and observations were used to understand the usability issues with the tools involved, and a human-centred design approach was adopted to produce a solution to these issues. The solution takes the form of a physical toolkit, similar to a board game, which allows the team to play through an example of a new product development in order to understand the process and the tools before using it for their own product development efforts. A complementary website enhances the physical toolkit, providing more examples of the tools being used as well as deeper discussions on each of the topics, allowing teams to adapt the process to their skills, preferences, and product type. Teams found the solution very useful and intuitive and experienced significantly less confusion and fewer mistakes with the process than teams who did not use it. Those with a design background found it especially useful for the engineering principles, like Quality Function Deployment, while those with an engineering or technology background found it especially useful for the design and customer requirements acquisition principles, like Voice of the Customer. Products developed using the toolkit are added to the website as further examples of how it can be used, creating a loop which helps future teams understand how the toolkit can be adapted to their project, whether it be a small consumer product or a large B2B service. The toolkit unlocks the potential of these beneficial tools for those in industry, both for large, experienced teams and for inexperienced start-ups. It allows users to assess the market potential of their product concept faster and more effectively, arriving at the product design stage with technical requirements prioritized according to their customers' needs and wants.
Keywords: new product development, fuzzy front-end, usability, Kano model, quality function deployment, voice of customer
Procedia PDF Downloads 108
6639 Probing Language Models for Multiple Linguistic Information
Authors: Bowen Ding, Yihao Kuang
Abstract:
In recent years, large-scale pre-trained language models have achieved state-of-the-art performance on a variety of natural language processing tasks. The word vectors produced by these language models can be viewed as dense encoded representations of natural language in text form. However, it is unknown how much linguistic information is encoded, and how. In this paper, we construct several probing tasks for multiple types of linguistic information to clarify the encoding capabilities of different language models, and we present the results visually. We first obtain word representations in vector form from different language models, including BERT, ELMo, RoBERTa, and GPT. Classifiers with a small number of parameters, together with unsupervised tasks, are then applied to these word vectors to assess their capability to encode the corresponding linguistic information. The constructed probing tasks cover both semantic and syntactic aspects. The semantic aspect includes the ability of the model to understand semantic entities such as numbers, time, and characters, while the syntactic aspect includes the ability of the language model to understand grammatical structures such as dependency relationships and reference relationships. We also compare the encoding capabilities of different layers in the same language model to infer how linguistic information is encoded within the model.
Keywords: language models, probing task, text presentation, linguistic information
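A probing classifier can be sketched in a few lines: frozen representations from one layer of a pre-trained model are fed to a small linear probe. The toy task (does the token denote a number?), the layer index, and the tiny word list are invented stand-ins for the paper's semantic and syntactic probes.

```python
# Sketch: a small logistic-regression probe over frozen BERT representations.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)

words = ["seven", "dog", "2019", "run", "forty", "blue"]
labels = [1, 0, 1, 0, 1, 0]                   # 1 = numeric entity (toy probe task)

feats = []
with torch.no_grad():
    for w in words:
        out = bert(**tok(w, return_tensors="pt"))
        layer = out.hidden_states[8]          # probe one intermediate layer
        feats.append(layer[0, 1].numpy())     # first wordpiece embedding

probe = LogisticRegression(max_iter=1000).fit(feats, labels)
print("probe train accuracy:", probe.score(feats, labels))
```

Repeating this over layers 1 through 12 gives the layer-wise comparison the abstract describes.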
Procedia PDF Downloads 110
6638 Application Difference between Cox and Logistic Regression Models
Authors: Idrissa Kayijuka
Abstract:
The logistic regression and Cox regression (proportional hazards) models are at present being employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used to model data on the time leading up to an event where censored cases exist. The logistic regression model, in contrast, is mostly applicable in cases where the independent variables consist of numerical as well as nominal values while the outcome variable is binary (dichotomous). The arguments and findings of many researchers have focused on overviews of the Cox and logistic regression models and their different applications in different areas. In this work, the analysis is done on secondary data sourced from SPSS exercise data on breast cancer, with a sample size of 1121 women; the main objective is to show the difference in application between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some of the analysis (e.g., on lymph node status) was done manually, and SPSS software was used to analyze the remaining data. This study found that there is a difference in application between the Cox and logistic regression models: the Cox regression model is used if one wishes to analyze data that also include the follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic regression model. A similarity between the two models is that they are both applicable to predicting the outcome of a categorical variable, i.e., a variable that can accommodate only a restricted number of categories. In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. The two models can be applied in many other research settings, since they are both suitable methods for analyzing data, but the Cox regression model is the more recommended.
Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio
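The application difference is easy to see in code: the Cox model consumes the follow-up time and reports hazard ratios, while the logistic model ignores it and reports odds ratios. The data frame below is synthetic, standing in for the SPSS breast-cancer exercise data.

```python
# Sketch: fit Cox (with follow-up time) and logistic (without) to the same data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(10)
n = 300
df = pd.DataFrame({
    "pos_nodes": rng.poisson(2, n),       # lymph node status proxy
    "age": rng.normal(55, 10, n),
})
risk = 0.3 * df["pos_nodes"] + 0.02 * df["age"]
df["time"] = rng.exponential(1 / np.exp(risk - risk.mean()))  # follow-up time
df["died"] = (rng.random(n) < 0.6).astype(int)                # event indicator

# Cox model: uses follow-up time, reports hazard ratios.
cph = CoxPHFitter().fit(df, duration_col="time", event_col="died")
print(np.exp(cph.params_).round(2))       # hazard ratios

# Logistic model: ignores follow-up time, reports odds ratios.
lr = LogisticRegression().fit(df[["pos_nodes", "age"]], df["died"])
print(np.round(np.exp(lr.coef_), 2))      # odds ratios
```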
Procedia PDF Downloads 455
6637 Comparison of Wake Oscillator Models to Predict Vortex-Induced Vibration of Tall Chimneys
Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta
Abstract:
The present study compares the semi-empirical wake-oscillator models that are used to predict vortex-induced vibration of structures, including those proposed by Facchinetti, by Farshidian and Dolatabadi, and by Skop and Griffin. These models combine a wake oscillator resembling the Van der Pol oscillator with a single-degree-of-freedom oscillation model. In order to use these models for estimating the top displacement of chimneys, only the first-mode vibration of the chimney is considered; the modal equation of the chimney constitutes the single-degree-of-freedom (SDOF) model. The equations of the wake oscillator model and the SDOF model are solved simultaneously using an iterative procedure. The empirical parameters used in the wake-oscillator models are estimated using a newly developed approach, and the responses are compared with experimental data, with which they appear comparable. The ode solver of MATLAB is used to carry out the iterative solution. For the comparative study, a tall concrete chimney of height 210 m has been chosen, with a base diameter of 28 m, a top diameter of 20 m, and a thickness of 0.3 m. The responses of the chimney are also determined using the linear model proposed by E. Simiu and the deterministic model given in the Eurocode. It is observed from the comparative study that the responses predicted by the Facchinetti model and by the model of Skop and Griffin are nearly the same, while the model proposed by Farshidian and Dolatabadi predicts a higher response. The linear model, which does not consider the aero-elastic phenomenon, gives a lower response than the non-linear models. Further, for large damping, the response predicted by the Eurocode compares relatively well with those of the non-linear models.
Keywords: chimney, deterministic model, van der pol, vortex-induced vibration
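A Facchinetti-type coupling can be sketched in dimensionless form with SciPy's ODE solver (the paper uses MATLAB's ode solver); all parameter values below are illustrative assumptions, not the chimney's calibrated constants.

```python
# Sketch: SDOF structure coupled to a Van der Pol wake oscillator
# (Facchinetti-type acceleration coupling), solved with solve_ivp.
import numpy as np
from scipy.integrate import solve_ivp

zeta, wn = 0.01, 1.0                   # structural damping ratio, natural frequency
ws, eps, A, M = 1.0, 0.3, 12.0, 0.05   # shedding freq., Van der Pol and coupling constants

def coupled(t, s):
    y, ydot, q, qdot = s               # structure displacement y, wake variable q
    yddot = -2 * zeta * wn * ydot - wn**2 * y + M * q              # SDOF forced by wake
    qddot = eps * ws * (1 - q**2) * qdot - ws**2 * q + A * yddot   # Van der Pol wake
    return [ydot, yddot, qdot, qddot]

sol = solve_ivp(coupled, (0, 400), [0.0, 0.0, 0.1, 0.0], max_step=0.05)
y = sol.y[0]
print(f"steady-state amplitude ~ {np.abs(y[len(y)//2:]).max():.3f} (dimensionless)")
```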
Procedia PDF Downloads 221
6636 An Adaptive Neuro-Fuzzy Inference System (ANFIS) Modelling of Bleeding
Authors: Seyed Abbas Tabatabaei, Fereydoon Moghadas Nejad, Mohammad Saed
Abstract:
The prediction of bleeding in asphalt is one of the most complex subjects in pavement engineering. In this paper, an Adaptive Neuro-Fuzzy Inference System (ANFIS) modeling the effect of important parameters on bleeding is trained and tested with experimental results. The bleeding index, based on the asphalt film thickness differential, is the target parameter; the input parameters are asphalt content, temperature at a depth of two centimeters, heavy traffic, dust-to-effective-binder ratio, Marshall strength, and the percentages passing the 3/4, 3/8, and 3/16 sieves and the No. 8, No. 50, No. 100, and No. 200 sieves. The empirical data were randomly divided into training and testing sections in order to accomplish the modeling: the ANFIS network was trained with 72 percent of the empirical data, and the remaining 28 percent, set aside for testing the appropriateness of the modeling, were entered into the ANFIS model. The results were compared with the empirical ones using two statistical criteria (R², RMSE). Considering the results, it is evident that the proposed ANFIS modeling is efficient and valid, and it can also be extended to more general cases.
Keywords: bleeding, asphalt film thickness differential, ANFIS modeling
Procedia PDF Downloads 269